Job Summary
Synechron is seeking a highly experienced Data Engineer to develop, optimize, and support scalable data pipelines and architectures that power enterprise analytics, data migration, and compliance requirements. This role involves working with big data tools, cloud platforms, and data security frameworks to ensure high-quality, reliable, and secure data solutions. Your leadership will improve operational data management and support strategic decision-making in regulated environments.
Software Requirements
Required:
Strong experience with SQL (MySQL, PostgreSQL, Oracle) for data validation, transformation, and performance tuning (6+ years)
Expertise in big data tools (Apache Spark, Hadoop, Hive) for large-scale data processing workloads (6+ years)
Proven experience building ETL/ELT pipelines using tools such as Informatica, Talend, or similar (6+ years)
Hands-on experience with cloud data platforms such as AWS (Glue, S3, Redshift), Azure, or GCP for data integration and migration (preferred)
Experience supporting data security policies, encryption, and compliance frameworks such as GDPR or HIPAA
Preferred:
Knowledge of NoSQL databases like MongoDB or Cassandra
Familiarity with cloud-native data services and automation frameworks (Terraform, CloudFormation)
Overall Responsibilities
Design, develop, and optimize ETL/ELT workflows supporting enterprise analytics and data migration
Build and maintain resilient big data architectures in cloud environments that handle large data volumes
Collaborate with data scientists, BI teams, and business analysts to translate requirements into scalable data solutions
Conduct data validation, reconciliation, and security assessments to meet compliance standards
Support data migration strategies and upgrades to meet operational and regulatory needs
Implement and uphold data governance, security, and privacy policies in support of compliance initiatives
Monitor data pipeline performance, troubleshoot issues, and optimize for reliability and security
Lead efforts in automating data workflows, infrastructure provisioning, and documentation of data architectures
Technical Skills (By Category)
Languages & Data Tools (Essential):
SQL (MySQL, PostgreSQL, Oracle) for data validation and performance tuning
Apache Spark (PySpark, Spark SQL) for large-scale data processing
ETL tools: Informatica, Talend, or similar, for building complex pipelines
Databases & Data Management:
Relational databases: MySQL, Oracle, PostgreSQL
Data warehousing and data governance tools supporting compliance standards
Cloud & Infrastructure:
AWS (Glue, S3, Redshift), Azure, or GCP cloud data services (preferred)
Infrastructure as Code: Terraform, CloudFormation
Data & Security:
Data encryption, access controls, GDPR and HIPAA compliance, and data privacy protocols aligned with industry standards
Experience Requirements
6+ years working with big data ecosystems, supporting large-scale and high-availability data pipelines
Proven experience designing ETL workflows, supporting migration and data validation in cloud environments
Experience working in regulated industries, ensuring data security and compliance requirements are met
Demonstrated ability to troubleshoot, optimize, and automate large data processes
Day-to-Day Activities
Develop and maintain ETL/ELT workflows for data ingestion, transformation, and analytics
Conduct performance analysis and optimize existing data pipelines for cost and efficiency
Support data migration projects and cloud migration activities
Ensure data security, encryption, and governance compliance
Troubleshoot data pipeline issues, conduct root cause analysis, and implement solutions
Collaborate with data analysts, data scientists, and application teams to validate data quality
Automate data workflows and infrastructure support processes
Document data architecture, data flows, security policies, and operational procedures
Qualifications
Bachelor’s or Master’s degree in Data Engineering, Computer Science, or a related field
6+ years of experience developing large-scale data pipelines and supporting data migration projects in cloud environments
Certifications such as AWS Data Analytics, GCP Professional Data Engineer, or Azure Data Engineer (preferred)
Proven experience supporting secure, compliant, and high-performance data solutions in enterprise environments
Professional Competencies
Strong analytical and troubleshooting skills supporting complex data workflows
Effective communication for stakeholder engagement and cross-team collaboration
Leadership qualities to guide junior team members and enforce best practices
Adaptability to evolving data security standards, cloud platforms, and industry regulations
Focus on operational security, data quality, and system reliability supporting business needs
SYNECHRON’S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference’ is committed to fostering an inclusive culture – promoting equality, diversity and an environment that is respectful to all. We strongly believe that, as a global company, a diverse workforce helps us build stronger, more successful businesses. We encourage applicants across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Candidate Application Notice