C2C Hiring: Big Data Engineer contract job in San Francisco, CA

Job Title: Big Data Engineer
Location: San Francisco, CA
Client: Apple

Job Description:

Apple is seeking a talented and driven Big Data Engineer to join our dynamic team in San Francisco, CA. As a Big Data Engineer at Apple, you will be responsible for designing, implementing, and optimizing data systems to manage, process, and analyze large-scale datasets that are critical to Apple’s innovative products and services.

Key Responsibilities:

  • Design, develop, and maintain large-scale data systems and pipelines to process and analyze massive amounts of structured and unstructured data.
  • Implement efficient, fault-tolerant, and scalable data architectures using big data technologies such as Hadoop, Spark, Kafka, and others.
  • Work closely with data scientists, analysts, and software engineers to understand data requirements and deliver solutions that support data-driven decision-making.
  • Build and maintain data integrations and ETL pipelines to collect, clean, and transform data from various sources.
  • Optimize data processing workflows to ensure high performance and low-latency data pipelines.
  • Collaborate with cross-functional teams to define data strategies, architecture, and frameworks that support the company’s data-driven vision.
  • Troubleshoot, debug, and resolve technical issues related to data workflows and infrastructure.
  • Stay up-to-date with emerging trends and technologies in big data and cloud computing to enhance system capabilities.

Required Skills and Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, or a related field (Master's preferred).
  • 5+ years of experience in big data engineering, data architecture, or related fields.
  • Proficiency in big data technologies (Hadoop, Spark, Kafka, etc.).
  • Experience with programming languages such as Java, Python, or Scala.
  • Strong experience with SQL and NoSQL databases (e.g., MySQL, MongoDB, Cassandra).
  • Familiarity with cloud platforms (AWS, Google Cloud, Azure) and tools for data processing and storage (e.g., S3, Redshift, BigQuery).
  • Proven experience in designing and maintaining scalable and high-performance data pipelines.
  • Knowledge of data warehousing concepts and data modeling.
  • Strong problem-solving and debugging skills, with the ability to work under pressure to deliver results.
  • Excellent communication skills and the ability to work collaboratively in a team-oriented environment.

Preferred Qualifications:

  • Experience with data streaming technologies (e.g., Apache Flink, Apache Beam).
  • Experience with containerization technologies (Docker, Kubernetes).
  • Familiarity with machine learning and AI frameworks.
  • Knowledge of DevOps practices and continuous integration/continuous deployment (CI/CD).

Regards,

Krishna
RECRUIT WORX
Email: krishna@recruitworx.in
