Data Systems Engineer (AWS, Snowflake, RedShift, Python, Scala, Hadoop, Spark, Kafka, Hive, API Handling, API Development, Data Migration, Batch Data Pipelines) in Charlotte, NC
API Development, AWS, AWS Lambda, Data Migration, Hadoop, Java, Oracle, Snowflake, SQL Server
Location: North Carolina
Job Function: Data Engineer
Date Of Job Posting: 08-19-2021
Position: Data Systems Engineer (AWS, Snowflake, RedShift, Python, Scala, Hadoop, Spark, Kafka, Hive, API Handling, API Development, Data Migration, Batch Data Pipelines)
Location: Charlotte, NC
Duration: Full-Time ONSITE position (no contracts, no corp-to-corp, no remote)
Salary: Excellent Compensation with benefits + 401K
SKILLS: AWS, Glue, Lambda, EMR, Snowflake, RedShift, Python, Scala, Java, Hadoop, Spark, Kafka, Hive, API Handling, API Development, Custom Data Pipeline Development, Data Migration, Oracle, SQL Server, Batch Data Pipelines, Agile
THE OPPORTUNITY:
For one of our reputed clients in the financial space, we have an immediate need for a Data Systems Engineer based in Charlotte, North Carolina.
Our ideal Data Systems Engineer will have an emphasis on data management, data processing, and developing frameworks for scalable data infrastructure solutions that integrate with heterogeneous data sources. The candidate will work closely with program managers, engineers, data scientists, the reporting team, and other key parts of the business to understand their data requirements and build systems and platforms that meet or exceed those specific business needs. You must be willing to get into the details but also be able to step back and help us plan the strategic direction of our data management and reporting practices.
THE ROLE:
·Utilize multiple development languages/tools such as Java, Python, and Scala, along with object-oriented approaches, to build prototypes and evaluate results for effectiveness and feasibility.
·Design, develop, test, and implement data-driven solutions to meet business requirements; quickly identify opportunities and recommend possible technical solutions by working with third-party vendors.
·Provide business analysis and develop ETL code to meet all technical specifications and business requirements according to the established architectural designs.
·Extract business data from multiple structured and unstructured data sources, utilizing data pipelines to ingest data into the Enterprise Data Lake (hybrid environment).
·Deploy application code and analytical models using CI/CD tools and techniques, and provide support for deployed data applications and analytical models using Jenkins and GitHub.
·Willing to take ownership of pipelines and able to communicate concisely and persuasively to varied audiences, including data providers, engineering, and analysts.
·Ability to research and assess open-source technologies and components, and to recommend and integrate them into the design and implementation.
REQUIREMENTS:
·Bachelor's degree in computer science, information systems or relevant field of study.
·Master's degree preferred
·Experienced in cloud technologies: AWS, Glue, Lambda, EMR, and Snowflake/RedShift databases
·Must have related technical experience and specialist-level knowledge of Python, Scala, or Java
·Should also have experience with the Hadoop ecosystem and Big Data technologies: Spark, AWS, Kafka, Hive, API handling, CDH
·Experience in API development and handling; experience in custom data pipeline development (cloud and in-house/local) and migrating data from large-scale data environments such as Oracle and SQL Server, with experience in the end-to-end design and build of near-real-time and batch data pipelines.
·Demonstrated ability to work with team members and clients to assess needs, provide assistance, and resolve problems.
·Excellent problem-solving skills, verbal/written communication, and the ability to explain technical concepts to business partners.
·Partner with development teams to ensure coding standards are in alignment with DevOps practices with respect to tools, standards, and security.
·Self-motivated and capable of delivering results with minimal ongoing direction. Ability to work in a fast-paced environment and manage multiple priorities in parallel.
·Automation mind-set - drive to continuously look for ways to automate existing processes.
·Experience working on Agile Scrum teams.