Candidates
6935 results
AWS Data Engineer
Chicago, Illinois
Corp To Corp
Immediate
Posted On 03/19/2025
Oracle Fusion HCM Consultant
Woodbridge, New Jersey
W2 - Contract
Immediate
Remote
Posted On 02/18/2026
Data Scientist
Carrollton, Texas
Corp To Corp
Immediate
Posted On 02/12/2026
Senior Data Engineer
McKinney, Texas
W2 - Contract
Immediate
Remote
Posted On 01/09/2026
Java Developer
McKinney, TX 70570
Corp To Corp
Immediate
Remote
Posted On 01/09/2026
.NET+ ANGULAR+AZURE Developer
Radcliff, North Carolina
Corp To Corp
Immediate
Posted On 04/29/2025
Cloud Security Engineer
Houston, Texas
Corp To Corp
Immediate
Remote
Posted On 12/16/2025
Accentuate IT Solutions LLC
Bench Recruiter
Herndon, VA
Created On 03/19/2025 | Candidate ID: CID10100178

AWS Data Engineer

Available
Chicago, Illinois, US
Pay Rate: 60 - 80 USD / hour
Experience: 8 Years
Availability: Immediate
Engagement Type: Corp To Corp
Onsite
Security Clearance

Candidate Details
Primary Skills
Spark, Python, Jupyter notebooks, AWS S3, Metabase, MySQL
Python, Flask and Django Frameworks
Tableau
Secondary Skills
SQL
Microsoft SQL Server Database Administration (DBA)
Big Data
Experience
8 Years

Candidate Summary

·        An IT professional with 8+ years of experience as a Data Engineer, extensively engaged in designing, developing, and implementing data models for enterprise-level applications and BI solutions.

·        Experience in designing and building the Data Management Lifecycle, covering Data Ingestion, Data Integration, Data Consumption, and Data Delivery, as well as Reporting, Analytics, and System-to-System integration.

·        Proficient in Big Data environments, with hands-on experience utilizing Hadoop ecosystem components for large-scale processing of structured and semi-structured data.

·        Strong experience with all project phases, including Requirement Analysis, Design, Coding, Testing, Support, and Documentation, using Apache Spark & Scala, Python, HDFS, YARN, Sqoop, Hive, MapReduce, and Kafka.

·        Extensive experience with Azure cloud technologies such as Azure Data Lake Storage, Azure Data Factory, Azure SQL, Azure SQL Data Warehouse, Azure Synapse Analytics, Azure Analysis Services, Azure HDInsight, and Databricks.

·        Solid knowledge of AWS services such as EMR, Redshift, S3, EC2, Lambda, and Glue, including configuring servers for auto-scaling and elastic load balancing.

·        Experience monitoring web services using Hadoop and Spark, controlling applications and analyzing their operation and performance.

·        Experienced in Python data manipulation for loading and extraction, as well as Python libraries such as NumPy, Pandas, Matplotlib, Seaborn, scikit-learn, and SciPy for data analysis and numerical computation.

·        Experience in the design and development of scalable systems using Hadoop technologies across various environments, analyzing data with MapReduce, Hive, and Pig.

·        Hands-on use of Spark and Scala to compare the performance of Spark with Hive and SQL, and of Spark SQL to manipulate DataFrames in Scala.

·        Strong knowledge of ETL methods for data extraction, transformation, and loading in corporate-wide ETL solutions, and of Data Warehouse tools for reporting and data analysis.

·        Experience with ETL tool environments such as SSIS and Informatica, and with reporting environments such as SQL Server Reporting Services, Power BI, and Business Objects.

·        Experience in application deployment and scripting using Unix/Linux shell scripts.

·        Solid knowledge of Data Marts, Operational Data Stores, OLAP, and Dimensional Data Modeling with Star Schema and Snowflake Schema modeling for dimension tables using Analysis Services.

·        Proficiency in writing complex SQL and PL/SQL for creating tables, views, indexes, stored procedures, and functions.

·        Knowledge of and experience with CI/CD pipelines using Docker containerization and Jenkins.