Cloud Data Engineer - AWS & Python

Brains Workgroup, Inc.
DevOps • Onsite • 1 East 78th Street, New York, NY • $141k-181k • Posted 19 days ago

Job Description

We are a leading bank based in New York City, seeking a Cloud Data Engineer for a permanent position that offers a competitive salary package ranging from $135,000 to $145,000, alongside exceptional benefits and a target bonus. Our Cloud Data Engineer will engage in various facets of data engineering, encompassing data design, development, testing, debugging, documentation, deployment, and ongoing production support. This role will require working on-site 2 to 3 days a week in our New York City office, providing an exciting opportunity to contribute to impactful data solutions in the financial sector.

Requirements

- Minimum of 3 years of experience with Databricks and PySpark for large-scale data processing in a financial services context.
- Over 5 years of solid programming expertise in Python and advanced SQL.
- Practical experience with Airflow for orchestrating workflows, including the design and management of complex DAGs.
- Familiarity with cloud platforms (Azure or AWS), including data lake architectures and associated services such as Azure Data Lake Storage or AWS S3.
- Understanding of financial domain data and regulatory standards, with experience in managing sensitive financial information.
- Strong problem-solving capabilities and communication skills, able to collaborate effectively in a team setting.

Responsibilities

- Design, develop, and enhance scalable data pipelines on Databricks leveraging PySpark to handle large quantities of financial data for real-time analytics and reporting.
- Construct ETL pipelines for both structured and semi-structured financial data sourced from batch and streaming processes using tools like Apache Kafka, Airflow, and SQL.
- Create Gold Layer transformations in Databricks for refined, high-quality datasets that support business intelligence and analytics.
- Work in collaboration with cross-functional teams, including data scientists, analysts, and business stakeholders, to provide high-quality data solutions that fulfill business needs.
- Apply DevOps best practices for data engineering workflows, integrating CI/CD pipelines to ensure effective and dependable data processing.
- Maintain data quality, governance, and compliance with financial regulations to uphold accurate financial reporting.
- Automate workflows utilizing Airflow for scheduling and orchestration, enhancing operational efficiency.
- Fine-tune Spark jobs and SQL queries for optimal performance and cost efficiency, ensuring timely and economic data processing.
