Solution Architect: Data Platforms
- Software Engineering
- Professional
In this role, you’ll work in one of our IBM Consulting Client Innovation Centres (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centres offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
At IBM, work is more than a job – it’s a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you’ve never thought possible. Are you ready to lead in this new era of technology and solve some of the world’s most challenging problems? If so, let’s talk.
Your Role and Responsibilities
- Create the Solution Outline and Macro Design describing end-to-end product implementation on Data Platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles
- Contribute to pre-sales and sales support through RFP responses, solution architecture, planning, and estimation
- Contribute to the development of reusable components, assets, and accelerators to support capability development
- Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies
- Participate in customer PoCs to deliver the agreed outcomes
- Participate in delivery and product reviews, perform quality assurance, and act as the design authority
Required Technical and Professional Expertise
- Experience in designing data products that provide descriptive, prescriptive, and predictive analytics to end users or other systems
- Experience in data engineering and architecting data platforms
- Experience in architecting and implementing data platforms on the Azure Cloud Platform
- Experience on Azure cloud is mandatory: ADLS Gen1 / Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hubs, Snowflake, Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow
- Experience with the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) on Cloudera or Hortonworks
Preferred Technical and Professional Expertise
- Experience in architecting complex data platforms on the Azure Cloud Platform and on-premises
- Experience with and exposure to implementations of Data Fabric and Data Mesh concepts, and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric
- Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake Data Glossary, etc.
Learn more about us
About IBM
IBM’s greatest invention is the IBMer. We believe that through the application of intelligence, reason and science, we can improve business, society and the human condition, bringing the power of an open hybrid cloud and AI strategy to life for our clients and partners around the world.
Restlessly reinventing since 1911, we are not only one of the largest corporate organizations in the world but also one of the biggest technology and consulting employers, with many of the Fortune 50 companies relying on the IBM Cloud to run their business.
At IBM, we pride ourselves on being an early adopter of artificial intelligence, quantum computing and blockchain. Now it’s time for you to join us on our journey to being a responsible technology innovator and a force for good in the world.