# databricks-integration
I wrote the following code to integrate with Flyte and Databricks. Could you please help me resolve it?

```python
import datetime
import json
import random
from operator import add

import flytekit
from flytekit import Resources, task, workflow
from flytekitplugins.spark import Spark
from pyspark import SparkContext
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator


@task(
    task_config=DatabricksRunNowOperator(
        task_id="notebook_run",
        tasks=[
            {
                "spark.driver.memory": "1000M",
                "spark.executor.memory": "1000M",
                "spark.executor.cores": "1",
                "spark.executor.instances": "2",
                "spark.driver.cores": "1",
                "new_cluster": {
                    "spark_version": "2.1.0-db3-scala2.11",
                    "num_workers": 2,
                },
                "notebook_task": {
                    "notebook_path": "/Users/airflow@example.com/PrepareData"
                },
            }
        ],
    )
)
def hello_spark(partitions: int) -> float:
    print("Starting Spark with Partitions: {}".format(partitions))
    # n = 100000 * partitions
    sess = flytekit.current_context().spark_session
```
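One likely issue in the snippet above: `task_config` is being given an Airflow `DatabricksRunNowOperator`, but a Flyte `@task` expects a Flyte plugin config object. Since `flytekitplugins.spark` is already imported, a configuration sketch using its `Spark` task_config might look like the following (the Spark settings are carried over from the question; the task body is a placeholder, not a complete job):

```python
import flytekit
from flytekit import task
from flytekitplugins.spark import Spark


@task(
    # Spark task_config from flytekitplugins-spark; the conf values
    # below are the ones from the original snippet.
    task_config=Spark(
        spark_conf={
            "spark.driver.memory": "1000M",
            "spark.executor.memory": "1000M",
            "spark.executor.cores": "1",
            "spark.executor.instances": "2",
            "spark.driver.cores": "1",
        }
    )
)
def hello_spark(partitions: int) -> float:
    print("Starting Spark with Partitions: {}".format(partitions))
    # The Spark session is provided by the Flyte execution context.
    sess = flytekit.current_context().spark_session
    # ... use sess to run the Spark computation ...
    return float(partitions)
```

This runs the work as a Flyte Spark task; routing it to Databricks specifically is a separate concern, discussed below.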
Do you need to write the code for the execution?
How do you send the execution to Databricks?
A backend plugin is the preferred approach, though it is not required.
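Without a backend plugin, one option is to trigger the run yourself against the Databricks Jobs REST API (`POST /api/2.1/jobs/run-now`). A minimal sketch, assuming you have a workspace host and a personal access token; the `job_id`, parameter names, and the `build_run_now_payload` helper are placeholders for illustration, not part of any library:

```python
import json
import urllib.request


def build_run_now_payload(job_id: int, notebook_params: dict) -> dict:
    # Hypothetical helper: assembles the JSON body expected by the
    # Databricks Jobs "run-now" endpoint.
    return {"job_id": job_id, "notebook_params": notebook_params}


def run_now(host: str, token: str, payload: dict) -> str:
    # Sends the run-now request; requires a real workspace host and a
    # valid personal access token.
    req = urllib.request.Request(
        "https://{}/api/2.1/jobs/run-now".format(host),
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": "Bearer {}".format(token),
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()


# Build (but do not send) a sample payload; job_id 42 is a placeholder.
payload = build_run_now_payload(42, {"input_path": "/mnt/data"})
print(json.dumps(payload))
```

Calling the API from inside a task keeps the setup simple; the trade-off versus a backend plugin is that Flyte then has no native visibility into the Databricks run's status.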