Azure Data Solutions Architect
Remote
Xebia is looking for a leading technical contributor who can consistently take a business or technical problem, refine it into a well-defined data problem/specification, present the solution to peers, and execute it at a high level. They have a strong focus on metrics, both for the impact of their work and for its inner workings/operations. They are a model for the team on best practices for software development in general (and data engineering in particular), including code quality, documentation, DevOps practices, and testing, and consistently mentor junior members of the team. They ensure the robustness of our services and serve as an escalation point in the operation of existing services, pipelines, and workflows.
This lead should demonstrate core engineering knowledge of and experience with industry technologies, practices, and frameworks, e.g. Databricks, Kubernetes, ArgoCD, ADO, Azure Service Bus and Pub/Sub, CI/CD, OpenTelemetry, networking principles, and scaling applications.
They must be expert at working closely and collaborating with nearshore and offshore delivery teams.
Primary responsibilities include the following:
Uses Azure or GCP cloud services and proprietary data platform tools to ingest, egress, and transform data from multiple sources.
Confidently optimizes the design and execution of complex data ingestion and data transformation solutions, using established patterns or improving on those patterns.
Produces well-engineered software, including appropriate automated test suites, technical documentation, and operational strategy.
Provides input into roadmaps (e.g., for Data Platforms and other Data Engineering teams) to help improve the overall program of work.
Ensures consistent application of platform capabilities to maintain quality and consistency in logging and lineage.
Fully versed in coding best practices and ways of working; participates in code reviews and partners with the team to implement established standards and to improve those standards where needed.
Adheres to the QMS framework, Security & Regulatory Standards, and CI/CD best practices, and helps guide improvements that strengthen ways of working. Provides leadership to team members to help others get the job done right.
Supports engineering teams in the adoption and creation of data mesh best practices.
Maintains best practices for engineering and architecture on our Confluence site.
Proactively engages in experimentation and innovation to drive relentless improvement.
Provides leadership and technical input to architecture and engineering teams.
Basic Qualifications:
We are looking for professionals with these required skills to achieve our goals:
BS in Computer Science, Software Engineering, Biomedical Engineering, Engineering, or Bioinformatics/Computational Biology, with 4+ years of experience (or MS with 2+ years of experience, or PhD) in the biotech/pharmaceutical/healthcare/diagnostics/health insurance space.
Extensive architecture, coding, and testing experience; excellent teamwork.
Proficient with at least three of the skills below, with demonstrable knowledge and relevant experience across the competencies of data engineering development, architecture design, and technology platforms/frameworks:
Hands-on experience with Azure Data Analytics services, e.g. ADLS, Azure Data Factory, Azure Databricks, Purview, Azure Synapse, etc.
Data Platforms and Domain-driven design.
Agile, DevOps & automation (of testing, build, deployment, CI/CD, etc.).
Data analytics & data quality/integrity.
Testing strategies & frameworks.
Kubernetes and ArgoCD/FluxCD.
Role requires:
Soft skills to lead a larger data engineering team.
Demonstrated skill in delivering high-quality engineered data products.
Knowledge of industry standards and technology platforms.
Excellent communication, negotiation, influencing, and stakeholder management skills.
Customer focus and excellent problem-solving skills.
Familiarity with and use of various cloud ecosystems and their services, including BigQuery, Databricks, key vaults, object stores, etc.
Good understanding of various software paradigms: domain-driven, procedural, data-driven, object-oriented, functional.
Deep knowledge of Python.
Demonstrable knowledge depth in more than one area of software engineering and technology.
Good-to-have Qualifications:
If you have the following characteristics, it would be a plus:
Experience in data structures (e.g. information management), data models, or relational database design.
Background in biomedical data processing is a plus.
Experience in GenAI and Agentic AI.
Subject matter expertise in Pharma CMC and scientific domains.
Experience applying data curation, virtualization, workflow, and advanced visualization techniques to enable decision support across multiple products and assets, driving results across R&D business operations.