PySpark Jobs

    7 jobs found, prices in USD

    Need Developer for PySpark coding in Databricks

    $4 / hr (Avg Bid)
    2 bids
    Pyspark + Python, 5 days left
    Verified

    Overall 4-9 years of IT experience including Java, Python … etc. • Minimum 2-3 years of experience in Spark (with Python, Java, or Scala); must have hands-on Spark experience. • Good knowledge of GCP, Airflow, and Dataproc. Role tracks: 1. Core Python and Spark, 2. Java and Spark, 3. Python and PySpark.

    $1642 (Avg Bid)
    7 bids
    pyspark support -- 3, 4 days left
    Verified

    I need support for my PySpark project.

    $90 (Avg Bid)
    8 bids

    Input: a tuple (id, termo) where "id" is the document identifier and "termo" is a word from the text that has already been pre-processed. (Pseudocode/Python/PySpark/Spark; see the sketch after this listing.)

    $100 (Avg Bid)
    2 bids
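
    The (id, termo) input described above is the classic starting point for building an inverted index, so here is a minimal PySpark sketch under that assumption. The SparkSession setup, sample data, and the inverted-index goal itself are illustrative guesses, not requirements stated in the posting.

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("inverted-index-sketch").getOrCreate()
        sc = spark.sparkContext

        # Input: (id, termo) pairs, where id is a document identifier and
        # termo is an already pre-processed word from that document.
        pairs = sc.parallelize([
            (1, "spark"), (1, "dados"), (2, "spark"), (3, "indice"),
        ])

        # Flip to (termo, id), drop duplicates, and collect the sorted ids per term.
        inverted = (
            pairs.map(lambda t: (t[1], t[0]))
                 .distinct()
                 .groupByKey()
                 .mapValues(sorted)
        )

        print(inverted.collect())  # e.g. [('spark', [1, 2]), ('dados', [1]), ('indice', [3])]
        spark.stop()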

    Looking for someone with at least 4-5 years of experience in the Big Data field and hands-on experience with PySpark, HDFS, Hive, Impala, shell scripting, SQL, HQL, and a scheduling tool such as Autosys/Airflow. This is a long-term project and we will pay on a monthly basis.

    $1136 (Avg Bid)
    11 bids

    Looking for someone with in-depth knowledge of Unix/shell scripting (bash), PySpark, HDFS, Hive, Impala, and any scheduling tool such as CA Automic, Control-M, or Autosys.

    $1126 (Avg Bid)
    14 bids

    I have a server (Debian 10) with Docker containers for Airflow and Spark. Both are on the same network. I also installed the Spark provider in Airflow. However, I am not able to run a SparkSubmitOperator task in Airflow; it keeps failing with an error. I need somebody to look at the setup and identify the issue, or suggest a better configuration. (A minimal example is sketched after this listing.)

    $39 / hr (Avg Bid)
    3 bids
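
    Since the posting above mentions a failing SparkSubmitOperator task, here is a minimal DAG sketch, assuming Airflow 2.x with the apache-airflow-providers-apache-spark package installed, a Spark connection named "spark_default" whose host points at the Spark container (for example spark://spark-master on port 7077), and an application script already present inside the Airflow container. The connection id, master URL, and file path are assumptions, not details from the posting.

        from datetime import datetime

        from airflow import DAG
        from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

        with DAG(
            dag_id="spark_submit_smoke_test",
            start_date=datetime(2023, 1, 1),
            schedule_interval=None,  # trigger manually while debugging the setup
            catchup=False,
        ) as dag:
            submit_job = SparkSubmitOperator(
                task_id="submit_example_job",
                application="/opt/airflow/jobs/example_job.py",  # path as seen from inside the Airflow container (assumed)
                conn_id="spark_default",  # Airflow connection pointing at the Spark master (assumed)
                verbose=True,  # surface the spark-submit output in the task log
            )

    A common cause of errors with this kind of setup is that SparkSubmitOperator shells out to spark-submit on the Airflow worker, so the Airflow container itself needs a Spark distribution and Java installed, in addition to network access to the Spark master.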