Data Architect at Concentrix Software Solutions
35K - 41K PLN / month
Remote
Kraków, Poland
Employment Contract
Skills
Big Data
Data Warehousing
ETL
Python
Description
Essential Duties:

  • Maintain our overall data architecture strategy, including data models, data integration, data governance, and data quality standards.
  • Design and implement scalable, efficient databases, data warehouses, and data lakes to support our business needs and analytics projects, drawing on experience with dimensional data modeling, RDBMS, and NoSQL platforms.
  • Manage the full life cycle of data warehouse and big data solutions, including requirements analysis, platform selection, technical architecture design, application design, testing, and deployment of the proposed solution.
  • Collaborate with partners, such as business analysts, data scientists, and IT teams, to understand data requirements and translate them into data solutions.
  • Benchmark systems, analyze system bottlenecks and propose solutions to eliminate them.
  • Evaluate and select appropriate technologies to support our data architecture strategy.
  • Guide and mentor the team in developing the proposed architectural solution.
  • Lead performance tuning and optimization of the data systems.
  • Lead the team in infrastructure setup phases.
  • Take end-to-end ownership for solution components and bring in design best practices and tools. Define data integration and ETL (Extract, Transform, Load) processes to ensure smooth data flow between systems and data sources.
  • Establish and enforce data governance practices, including data security, data privacy, and data access controls.
Skills:
  • 8+ years of work experience as a data architect (enterprise level)
  • Minimum of 5 years' experience in data warehouse design for complex data platforms, including star schemas, dimensional modeling, and extract, transform, load (ETL) design.
  • Knowledge of data modeling using the Kimball and Inmon methodologies
  • 5+ recent years in the big data ecosystem. Expertise in cloud-based data platforms such as AWS or GCP. In-depth knowledge of data modeling concepts and techniques, including relational, dimensional, and NoSQL data models.
  • Python expertise, with responsibility for designing and coding data pipeline projects that produce data for valuable insights.
  • Expert understanding of architectural and data warehouse design principles and data integration.
  • Expertise with data integration technologies such as Kafka and Spark.
  • Knowledge of ETL technologies such as Pentaho, SSIS, Informatica, or DataStage.
  • Experience working with RDBMS such as Oracle, SQL Server, MySQL, and PostgreSQL.
  • Knowledge of data governance practices, data security, and privacy regulations (e.g., GDPR, CCPA).
About company
As part of Concentrix Software Solutions, you have the opportunity to shape the technological future of the CX industry. Join a team that runs projects for the world's leading brands using the latest IT tools and is at the forefront of innovation.