Job Opportunity at eClerx: Process Manager
Posting Date: 26/07/2024
Department: Process Management
About the Team
eClerx provides critical business operations services to over fifty global Fortune 500 clients, including some of the world’s leading companies across financial services, cable & telecom, retail, fashion, media & entertainment, manufacturing, travel & leisure, software, and high-tech sectors. Incorporated in 2000, eClerx is one of India’s leading process management and data analytics companies and is publicly traded on both the Bombay and National Stock Exchanges of India. eClerx employs over 9,500 people across its global sites in the US, UK, India, Italy, Germany, Singapore, and Thailand.
eClerx Digital is the trusted partner of choice to the world’s largest global brands for creative production, eCommerce/web operations, and analytics & insight services. We improve profitability for their digital businesses using the Follow the Sun delivery model. Our team of 3,000+ full-time digital delivery employees across five production hubs in Mumbai, Pune, Chandigarh, Verona, and Phuket applies deep digital expertise to support the most demanding global clients effectively. eClerx Digital’s innovative delivery model drives the 'metrics that matter' for our clients: improved acquisition, conversion, retention, and overall customer lifetime value, 24/7/365.
Job Description
The ideal candidate will possess knowledge relevant to the functional area, act as a subject matter expert, and focus on continuous improvement for maximum efficiency. The candidate must maintain high standards of delivery excellence, provide top-notch service quality, and develop successful long-term business partnerships with internal and external customers. They should be proactive, able to break down complex problems systematically, generate and compare multiple options, and set priorities to resolve issues. Strong communication skills are essential for explaining organizational objectives and motivating the team.
Responsibilities:
- Designing and implementing scalable, reliable, and maintainable data architectures on AWS.
- Developing data pipelines to extract, transform, and load (ETL) data from various sources into AWS environments.
- Creating and optimizing data models and schemas for performance and scalability using AWS services such as Amazon Redshift, AWS Glue, and Athena.
- Integrating AWS data solutions with existing systems and third-party services.
- Monitoring and optimizing the performance of AWS data solutions, ensuring efficient query execution and data retrieval.
- Implementing data security and encryption best practices in AWS environments.
- Documenting data engineering processes, maintaining data pipeline infrastructure, and providing support as needed.
- Working closely with cross-functional teams including data scientists, analysts, and stakeholders to understand data requirements and deliver solutions.
Technical and Functional Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering and AWS cloud environments.
- Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, and EMR.
- Proficiency in programming languages like Python, SQL, Scala, or Java.
- Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift.
- Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue).
- Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts.
- Understanding of big data technologies such as Hadoop, Spark, Kafka, and their integration with AWS.
- Proficiency with version control tools like Git and infrastructure-as-code tools (e.g., AWS CloudFormation, Terraform).
- Ability to analyze complex technical problems and propose effective solutions.
- Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders.