- Strong Python Skills - Proficient in Python for data manipulation, automation, and building reusable components.
- Data Pipeline Development - Experience designing and maintaining ETL/data pipelines using tools like Airflow or custom scripts.
- Database Knowledge - Hands-on with SQL and NoSQL databases such as PostgreSQL, MySQL, and MongoDB.
- Data Analysis - Skilled in data wrangling and analysis using Pandas, NumPy, and similar libraries.
- ETL Expertise - Capable of building scalable ETL workflows for large structured and unstructured datasets.
- Cloud Platform Exposure - Familiar with AWS, Azure, or GCP for data storage, processing, and deployment.
- API Integration - Experience consuming and integrating RESTful APIs for data exchange.
- Big Data Tools (Preferred) - Exposure to PySpark, Hadoop, or Kafka for handling large-scale data.
- CI/CD & DevOps - Proficient with Git, Jenkins, and Docker for code versioning and automated deployment.
Role: Data Engineer
Industry Type: IT Services & Consulting
Department: Engineering - Software & QA
Employment Type: Full Time, Permanent
Role Category: Software Development
About the company
DXC Technology (NYSE: DXC) helps global companies run their mission-critical systems and operations while modernizing IT, optimizing data architectures, and ensuring security and scalability across public, private, and hybrid clouds. The world's largest companies and public sector organizations trust DXC to deploy services that drive new levels of performance, competitiveness, and customer experience across their IT estates.