# Apache Spark with Picodata JDBC Example
## Requirements
- Git and Git LFS plugin
- JDK 11+
- Scala
- Docker or Podman
## Running this example
- Ensure that you have the Git LFS plugin installed. For example, on RedHat-like systems you can install it with `dnf`:

  ```shell
  dnf install git-lfs
  ```
- Check out this repository using Git and enter the example directory:

  ```shell
  git clone https://git.picodata.io/picodata/picodata/examples.git
  cd examples/picodata-jdbc-spark-example
  ```
- Fetch large files from Git (they are not downloaded by default with the previous command):

  ```shell
  git lfs fetch origin master
  ```
- Go to the `src/main/resources` directory and start the Docker containers with the example Picodata cluster:

  ```shell
  docker-compose up -d
  ```
- Create a new Picodata user for the JDBC driver in the container:

  ```shell
  docker-compose exec picodata-1 bash -c "echo -ne \"CREATE USER \\\"sqluser\\\" WITH PASSWORD 'P@ssw0rd' USING md5;\nGRANT CREATE TABLE TO \\\"sqluser\\\";\" | picodata admin /var/lib/picodata/picodata-1/admin.sock"
  ```
- Build the project by executing the following command in the repository root directory:

  ```shell
  sh ./gradlew build
  ```
- Run the application:

  ```shell
  sh ./gradlew run
  ```

  It will compile and run the application.
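The application launched above talks to the example cluster through the Picodata JDBC driver. As a rough sketch of what a Spark JDBC read against this setup could look like (the JDBC URL scheme, port, driver class name, and table name below are illustrative assumptions, not values taken from this example's sources — check the Picodata JDBC driver documentation and the application code for the actual ones):

```scala
import org.apache.spark.sql.SparkSession

object PicodataJdbcSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("picodata-jdbc-spark-example")
      .master("local[*]") // run Spark locally for the sketch
      .getOrCreate()

    // Read a table over JDBC using the credentials created in the setup step.
    // The URL, driver class, and table name are hypothetical placeholders.
    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:picodata://localhost:5432/") // hypothetical URL/port
      .option("driver", "io.picodata.jdbc.Driver")      // hypothetical class name
      .option("dbtable", "my_table")                    // hypothetical table
      .option("user", "sqluser")                        // user created above
      .option("password", "P@ssw0rd")                   // password from CREATE USER
      .load()

    df.show()
    spark.stop()
  }
}
```

The `user` and `password` options match the `CREATE USER` statement executed against the cluster earlier; everything else would come from the driver's documentation.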