
Apache Spark with Picodata JDBC Example

Requirements

  • Git and Git LFS plugin
  • JDK 11+
  • Scala
  • Docker or Podman

Running this example

  1. Ensure that you have the Git LFS plugin installed. For example, on RedHat-like systems you can use dnf:
dnf install git-lfs
  2. Clone the examples repository with Git; this example lives in the picodata-jdbc-spark-example subdirectory:
git clone https://git.picodata.io/picodata/picodata/examples.git
cd examples/picodata-jdbc-spark-example
  3. Fetch the large files tracked by Git LFS (they are not downloaded by the previous command):
git lfs fetch origin master
  4. Go to the src/main/resources directory and start the example Picodata cluster in Docker containers:
docker-compose up -d
  5. Create a new Picodata user for the JDBC driver in the container:
docker-compose exec picodata-1 bash -c "echo -ne \"CREATE USER \\\"sqluser\\\" WITH PASSWORD 'P@ssw0rd' USING md5;\nGRANT CREATE TABLE TO \\\"sqluser\\\";\" | picodata admin /var/lib/picodata/picodata-1/admin.sock"
  6. Return to the repository root directory and build the project:
./gradlew build
  7. Run the application:
./gradlew run

It will compile and run the application.
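Once the cluster is up, the kind of connection the application makes can be sketched in Scala over plain JDBC. This is a minimal sketch, not the example's actual code: the URL scheme jdbc:picodata:// and the host/port are assumptions (check the Picodata JDBC driver documentation for the exact form), while the user and password match the ones created in step 5.

```scala
import java.sql.DriverManager
import java.util.Properties

object JdbcSketch {
  // Build connection properties for the user created in step 5.
  def connectionProps(user: String, password: String): Properties = {
    val props = new Properties()
    props.setProperty("user", user)
    props.setProperty("password", password)
    props
  }

  def main(args: Array[String]): Unit = {
    // Placeholder URL: substitute the address and scheme your Picodata
    // JDBC driver actually uses; the driver jar must be on the classpath.
    val url = "jdbc:picodata://localhost:5432/"
    val conn = DriverManager.getConnection(url, connectionProps("sqluser", "P@ssw0rd"))
    try {
      val rs = conn.createStatement().executeQuery("SELECT 1")
      while (rs.next()) println(rs.getInt(1))
    } finally conn.close()
  }
}
```

In the Spark application itself, the same URL and credentials would typically be handed to Spark's JDBC data source (spark.read.jdbc(url, table, props)) rather than used through DriverManager directly.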