Spark Script for Hive to S3 Data Migration

$30-250 AUD

Closed
Posted 28 days ago

Paid on delivery
Create a Spark script to transfer metastore data from Hive to S3:
- Create a connection to the Hive metastore
- Fetch the [login to view URL] definition for the database
- Create a connection to the S3 bucket
- Create a new [login to view URL] within the S3 Hive metastore
- Transfer data from the Hive metastore to S3
- Configure multiple [login to view URL] creation based on config variables
- Create recursive data transfer based on differences in the data

Skills and Experience:
- Proficiency in Spark and Hive
- Extensive experience with S3 buckets
- Understanding of data backup strategies

Project Details:
- The script needs to read the schema and perform a metadata transfer for the selected schemas to the S3 bucket.
- Only bid if you have work experience with Spark, Hive, and S3.
- Multiple schemas need to be migrated.
- I have a local instance of NetApp S3 available and the bucket created.
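For orientation, the brief above roughly corresponds to a PySpark sketch along the following lines. This is a minimal sketch only, assuming a Spark build with Hive support and the Hadoop S3A connector; the bucket URL, NetApp endpoint, credentials, and schema list are hypothetical placeholders rather than values taken from this posting.

# Minimal sketch: copy table definitions and data for selected Hive schemas
# to an S3-compatible bucket. All concrete values below are placeholders.
from pyspark.sql import SparkSession

S3_BUCKET = "s3a://hive-backup"        # hypothetical target bucket
SCHEMAS = ["sales", "finance"]         # hypothetical schemas to migrate

spark = (
    SparkSession.builder
    .appName("hive-to-s3-migration")
    .enableHiveSupport()  # connect to the Hive metastore
    .config("spark.hadoop.fs.s3a.endpoint", "https://netapp-s3.local:9000")  # hypothetical NetApp S3 endpoint
    .config("spark.hadoop.fs.s3a.path.style.access", "true")  # common setting for on-prem S3
    .config("spark.hadoop.fs.s3a.access.key", "ACCESS_KEY")   # placeholder credentials
    .config("spark.hadoop.fs.s3a.secret.key", "SECRET_KEY")
    .getOrCreate()
)

for schema in SCHEMAS:
    # Fetch the table definitions registered in the metastore for this database.
    tables = [row.tableName for row in spark.sql(f"SHOW TABLES IN {schema}").collect()]
    for table in tables:
        df = spark.table(f"{schema}.{table}")
        # Write each table under <bucket>/<schema>/<table> as Parquet.
        # mode("overwrite") keeps repeated full runs idempotent.
        df.write.mode("overwrite").parquet(f"{S3_BUCKET}/{schema}/{table}")

The "recursive transfer based on differences in the data" requirement would sit on top of this loop, for example by comparing Hive partition values or modification timestamps against the objects already present in the bucket and copying only what has changed; the schema list and target paths would likewise come from the config variables mentioned above rather than being hard-coded.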
Project ID: 38029405

About the project

5 proposals
Remote project
Active 22 days ago

5 freelancers are bidding an average of $150 AUD for this job
Hello, I have 10 years of experience in Spark and AWS S3. I will create a Spark script that reads the schema and performs the metadata transfer for the selected schema to the S3 bucket. Regards, VishnuLal
$250 AUD in 3 days
5.0 (4 reviews)
With over 2 years of hands-on experience in PySpark and AWS, I specialize in developing Python scripts and PySpark code tailored for banking environments. My expertise lies in leveraging these technologies to efficiently process and analyze data, ensuring robust and scalable solutions for banking operations.
$120 AUD in 7 days
0.0 (0 reviews)
Hi, I have expertise in Spark, Hive, and S3, with 9+ years of experience in the same. Let's connect to discuss it. Thanks
$140 AUD in 7 days
0.0 (0 reviews)
Hello, how are you? Thank you for the job posting. With a robust background in Spark, Hive, and S3, I am well-equipped to undertake your metadata transfer project. My experience includes crafting Spark scripts for seamless data migration, leveraging Hive for efficient schema management, and managing S3 buckets for optimal data storage. I understand the nuances of data backup strategies, ensuring the integrity and security of your information throughout the transfer process. I am confident in my ability to create a flexible and efficient script that accommodates multiple schemas, adhering to your project specifications. My past work in similar projects underscores my capability to deliver results promptly and reliably. I look forward to the opportunity to contribute to your project's success. Best regards, Darko Djokic
$140 AUD in 7 days
0.0 (0 reviews)
With five years of experience as a data analyst, I am excited to propose my expertise for the development of a Spark script to migrate data from Hive to an S3 bucket. My extensive experience includes working on Spark scripts for data migration from various sources such as Hive, PostgreSQL, and MySQL. My recent project also involved migrating data from Hive to an S3 bucket, where I successfully optimized storage for improved efficiency. This hands-on experience ensures that I am well-equipped to handle the challenges and requirements of this project effectively.
$100 AUD in 4 days
0.0 (0 reviews)

About the client

Middle Park, Australia
5.0
21
Payment method verified
Member since September 9, 2009

Client verification
