
GCP Big Data Developer - Remote

TheHiveCareers

Remote · Full time · 2 days ago

Primary & Must-Have Skills

Python/PySpark and GCP Cloud experience (must-have)

Hadoop ecosystem exposure (knowledge of HDFS, Hive, and big data concepts)

Strong SQL skills (SQL is the foundation for everything we do in HQL)

Cloud working experience (GCP preferred)
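Since the role stresses that SQL underpins all HQL work, a minimal sketch of the kind of query involved may help candidates gauge fit. This uses Python's built-in sqlite3 as a stand-in for Hive; the table, columns, and data are invented for illustration, but the GROUP BY/HAVING pattern carries over almost verbatim to HQL.

```python
import sqlite3

# Hypothetical example: table name, columns, and rows are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE page_views (user_id TEXT, page TEXT, views INTEGER)")
conn.executemany(
    "INSERT INTO page_views VALUES (?, ?, ?)",
    [("u1", "home", 3), ("u1", "search", 1), ("u2", "home", 5)],
)

# Aggregate views per page; the same SQL shape is what you would
# write in HQL against a Hive table on HDFS.
rows = conn.execute(
    """
    SELECT page, SUM(views) AS total_views
    FROM page_views
    GROUP BY page
    HAVING SUM(views) > 1
    ORDER BY total_views DESC
    """
).fetchall()
print(rows)  # [('home', 8)]
```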
Job Description
Total IT experience of 5-12 years
Knowledge of cloud platforms (GCP).
Analyze and organize raw data
Build data systems and pipelines

Evaluate business needs and objectives
Interpret trends and patterns

Conduct complex data analysis and report on results

Prepare data for prescriptive and predictive modeling
Build algorithms and prototypes

Combine raw information from different sources

Explore ways to enhance data quality and reliability

Identify opportunities for data acquisition

Develop analytical tools and programs

Collaborate with data scientists and architects on several projects

Technical expertise with data models, data mining, and segmentation techniques

Knowledge of programming languages (e.g. Java and Python)

Hands-on experience with SQL database design

Great numerical and analytical skills
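The duties above (combining raw information from different sources, enhancing data quality) can be sketched in miniature. This is a hypothetical plain-Python illustration, not the employer's actual pipeline; all field names and records are invented, and in practice a step like this would typically run in PySpark.

```python
# Hypothetical raw records from two sources with inconsistent formatting.
crm_records = [
    {"email": "a@example.com", "name": "Ana "},
    {"email": "b@example.com", "name": "Ben"},
]
web_records = [
    {"email": "A@example.com", "name": "ana"},
    {"email": "c@example.com", "name": "Cy"},
]

def clean(record):
    # Normalize casing and whitespace to improve data quality.
    return {"email": record["email"].lower(),
            "name": record["name"].strip().title()}

# Merge on email; records listed later win on conflicts (CRM wins here).
combined = {}
for record in map(clean, web_records + crm_records):
    combined[record["email"]] = record

merged = sorted(combined.values(), key=lambda r: r["email"])
print(merged)  # three deduplicated, cleaned records
```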
Soft Skills:
Good communication skills

Flexibility to work with and learn new technologies

via JSearch