
Java Big Data Developer (Remote Initially)

Location Philadelphia, United States
Posted 19-August-2022

Position: Java-Big Data Developer

Location: USA-Philadelphia (Remote Initially)

Long Term Contract

Skills: Java, Big Data, Spark, Scala


Client is looking for a talented and creative ETL Developer who takes responsibility and ownership in providing software solutions and contributing to the overall success of the team.

Responsibilities:

Hands-on architecture/development of ETL pipelines using our internal framework written in Java

Hands-on architecture of real-time REST APIs or other solutions for streaming data from Graph using Spark

Interpret data, analyze results using statistical techniques and provide ongoing reports

Develop and implement databases, data collection systems, data analytics and other strategies that optimize statistical efficiency and quality

Acquire data from primary or secondary data sources and maintain databases/data systems

Identify, analyze, and interpret trends or patterns in complex data sets

Filter and clean data by reviewing reports and performance indicators to locate and correct problems

Work with management to prioritize business and information needs

Locate and define new process improvement opportunities
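The posting's internal Java framework is not public, so as a minimal illustration of the extract-transform-load flow and the "filter and clean data" responsibility described above, here is a hedged, self-contained Java sketch (class and method names are hypothetical, and a simple in-memory map stands in for the database load step):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical sketch of an ETL step: extract raw rows, clean/filter them,
// then load the result. Not the client's actual framework.
public class EtlSketch {

    // Extract: parse raw CSV-style lines into (name, score) records.
    static List<String[]> extract(List<String> rawLines) {
        return rawLines.stream()
                .map(line -> line.split(","))
                .collect(Collectors.toList());
    }

    // Transform: drop malformed rows and trim whitespace, mirroring the
    // "filter and clean data by reviewing reports" responsibility.
    static List<String[]> transform(List<String[]> records) {
        return records.stream()
                .filter(r -> r.length == 2 && !r[1].trim().isEmpty())
                .map(r -> new String[] { r[0].trim(), r[1].trim() })
                .collect(Collectors.toList());
    }

    // Load: aggregate into a name -> score map (a stand-in for a database write).
    static Map<String, Integer> load(List<String[]> records) {
        return records.stream()
                .collect(Collectors.toMap(r -> r[0], r -> Integer.parseInt(r[1])));
    }

    public static void main(String[] args) {
        List<String> raw = List.of("alice, 10", "bob,20", "broken");
        Map<String, Integer> out = load(transform(extract(raw)));
        System.out.println(out.get("alice") + " " + out.get("bob")); // prints "10 20"
    }
}
```

In a production pipeline of the kind the posting describes, the same three stages would typically be expressed as Spark Dataset transformations rather than in-memory streams.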


At least 8 years of experience architecting and implementing complex ETL pipelines, preferably with the Spark toolset.

At least 4 years of experience with Java, particularly within the data space

Technical expertise regarding data models, database design and development, data mining, and segmentation techniques

Good experience writing complex SQL and ETL processes

Excellent coding and design skills, particularly in Java, Scala, and Python.

Experience working with large data volumes, including processing, transforming and transporting large-scale data

Experience in AWS technologies such as EC2, Redshift, CloudFormation, EMR, S3, and AWS Analytics required.

Big data-related AWS technologies such as Hive, Presto, and Hadoop required.

AWS certification is preferable: AWS Developer/Architect/DevOps/Big Data

Excellent working knowledge of Apache Hadoop, Apache Spark, Kafka, Scala, Python, etc.

Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy

Good understanding & usage of algorithms and data structures

Good experience building reusable frameworks.

Experience working in an Agile Team environment.


Excellent communication skills both verbal and written


Sandip Kumar

Team Lead

+1.307.316.7223

Noralogic | 109 E 17th St, Cheyenne WY 82001


Mexico: Guadalajara, Monterrey

India: Noida UP

WBE and MBE company

ISO 9001:2015

WY Top 50 Minority owned growing company
