Projects

Successful Projects

Some examples of work we have completed.
dbt · Snowflake · Qlik

Data Stack Modernisation at Johnson & Johnson (J&J)

Client Background

Johnson & Johnson Innovative Medicine (J&J) is a leading pharmaceutical company dedicated to advancing healthcare through innovative solutions and patient-centric care. With a focus on research, development, and commercialisation of pharmaceutical products, J&J relies heavily on data-driven insights to drive decision-making across various business functions.

Challenge

Johnson & Johnson's data infrastructure ("Ensemble 1.0") struggled with outdated technology, hindering agility and efficiency. Key pain points included:

  • Limited BI access:
    Qlik Sense gave analysts only limited scope for data exploration.
  • Data quality issues:
    Inconsistent data pipelines led to unreliable insights.
  • Slow processing times:
    Long refresh cycles hampered timely decision-making.

Our Mission

We were tasked with leading the UK migration to Ensemble 2.0, a modern data stack established for the EMEA region to address these challenges. Our responsibilities focused on:

  • Migrating from Qlik Sense to dbt:
    Spearheading the transition from Qlik's proprietary interface to dbt's SQL-based data transformation, empowering analysts with direct data modelling and code-driven workflows.
  • Seamless Snowflake integration:
    Using Airbyte, we integrated data sources into Snowflake, the cloud-based data warehouse at the heart of Ensemble 2.0, ensuring scalable storage, efficient processing, and unified access to diverse data sets.
  • Orchestrating data pipelines with Airflow:
    We designed and implemented Airflow pipelines to automate data ingestion, transformation, and loading into Snowflake, ensuring reliable data delivery and minimising manual intervention (a simplified sketch follows this list).
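
To make the orchestration concrete, here is a minimal sketch of the pattern, assuming Airflow 2 with the Airbyte provider installed; the DAG name, connection ids, and dbt project path are hypothetical rather than the production configuration.

  from datetime import datetime

  from airflow import DAG
  from airflow.operators.bash import BashOperator
  from airflow.providers.airbyte.operators.airbyte import AirbyteTriggerSyncOperator

  with DAG(
      dag_id="ensemble_daily_load",            # hypothetical name
      start_date=datetime(2024, 1, 1),
      schedule="@daily",
      catchup=False,
  ) as dag:
      # Trigger an Airbyte sync that lands source data in Snowflake.
      ingest = AirbyteTriggerSyncOperator(
          task_id="airbyte_sync_sources",
          airbyte_conn_id="airbyte_default",
          connection_id="REPLACE-WITH-CONNECTION-UUID",  # hypothetical
      )

      # Run dbt transformations, then dbt tests, against the warehouse.
      transform = BashOperator(
          task_id="dbt_run",
          bash_command="cd /opt/dbt/ensemble && dbt run --target prod",
      )
      test = BashOperator(
          task_id="dbt_test",
          bash_command="cd /opt/dbt/ensemble && dbt test --target prod",
      )

      ingest >> transform >> test

The dedicated dbt test task is what backs the testing claims under Impact below: schema and data tests run on every load, so quality issues surface before they reach the BI tools.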

Impact

The Ensemble 2.0 transformation delivered significant improvements:

  • Empowered analytics: dbt provided analysts with greater flexibility and control over data exploration, unlocking deeper insights.
  • Enhanced data quality: Streamlined pipelines and robust testing in dbt significantly improved data accuracy and consistency.
  • Enhanced maintainability: Where Qlik Sense transformations had largely been built application by application, the new data modelling techniques produced consolidated models that feed all BI tools.

Beyond the code

This project wasn't just about technology. We actively collaborated with stakeholders across the organisation:

  • Understanding business needs: We gathered requirements and ensured the new data stack aligned with strategic goals.
  • Change management: We championed adoption of the new tools and processes through training, fostering user confidence and buy-in.
  • DataOps: We brought DataOps practices to the UK organisation, including CI/CD, linting, pre-commit hooks, automated testing, and development standards.
IoT · GraphQL · Kafka

Ship Sensor Data Platform for Navy Digital

Client Background

The Royal Navy, a vital branch of the UK Ministry of Defence, operates a diverse fleet of ships across the globe. Application development demanded real-time access to accurate sensor data to ensure crew safety, optimise resource allocation, and maintain a high level of situational awareness. However, in some areas the existing data infrastructure was siloed, outdated, and unable to meet these modern demands.

Challenge

Faced with these limitations, Navy Digital, the digital transformation arm of the Ministry of Defence, proposed a modern data platform to capture, process, and expose ship sensor data. This platform needed to provide real-time insights to applications used by the crew, enabling them to make informed decisions and enhance operational efficiency.

Our Mission

Acting as architectural lead at Navy Digital, we led the design and implementation of a new data platform, focusing on:

  • Ingesting sensor data:
    Architecting a scalable data pipeline using Apache Nifi and Kafka on Kubernetes to capture and stream sensor data from diverse sources in real-time.
  • Stream processing and enrichment:
    Designing data processing pipelines to cleanse, transform, and enrich sensor data using stream processing frameworks, extracting valuable insights for applications.
  • GraphQL API exposure:
    Defining and implementing a GraphQL API using open-source technologies, providing a flexible and intuitive interface for crew applications to access and utilise sensor data (a condensed sketch follows this list).
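
As a condensed illustration of the consume-enrich-expose pattern (not the actual Navy Digital code), the sketch below pairs a Kafka consumer with a Strawberry GraphQL schema; the topic, field names, and enrichment rule are all hypothetical.

  import json
  import typing

  import strawberry
  from kafka import KafkaConsumer  # kafka-python client

  # Latest enriched reading per sensor; the real platform would use a
  # durable store fed by the stream processors rather than a dict.
  enriched_store: typing.Dict[str, dict] = {}

  def run_consumer() -> None:
      consumer = KafkaConsumer(
          "ship.sensors.raw",                     # hypothetical topic
          bootstrap_servers="kafka:9092",
          value_deserializer=lambda b: json.loads(b.decode("utf-8")),
      )
      for msg in consumer:
          reading = msg.value
          if reading.get("value") is None:        # cleanse: drop bad readings
              continue
          # Enrich: flag readings that breach the sensor's threshold.
          threshold = reading.get("threshold", float("inf"))
          reading["severity"] = "alert" if reading["value"] > threshold else "normal"
          enriched_store[reading["sensor_id"]] = reading

  @strawberry.type
  class SensorReading:
      sensor_id: str
      value: float
      severity: str

  @strawberry.type
  class Query:
      @strawberry.field(description="Latest enriched reading per sensor")
      def readings(self) -> typing.List[SensorReading]:
          return [
              SensorReading(sensor_id=r["sensor_id"], value=r["value"],
                            severity=r["severity"])
              for r in enriched_store.values()
          ]

  schema = strawberry.Schema(query=Query)

Because clients name the fields they want (for example { readings { sensorId severity } }), each crew application pulls exactly the slice of sensor data it needs, which is the flexibility GraphQL offers over fixed payloads.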

Impact

The new data platform delivered significant benefits:

  • Enhanced situational awareness: Real-time sensor data empowered the crew with clear insight into ship operations and the surrounding environment, improving decision-making.
  • Increased operational efficiency: Streamlined data access and analysis enabled faster response times and optimised resource allocation, leading to improved mission effectiveness.
  • Modern and scalable architecture: The Kubernetes-based infrastructure provided scalability and flexibility to accommodate future data growth and changing requirements.

Expertise applied

This project drew on more than technology. We combined close stakeholder collaboration with hands-on architecture and delivery leadership:

  • Stakeholder engagement: We actively liaised with senior Navy stakeholders, translating their needs into technical requirements and ensuring solution alignment with strategic goals.
  • Design leadership: We owned the high-level design, facilitated design reviews, and fostered collaboration among architects and engineers, ensuring a unified and high-quality solution.
  • Project delivery: We played a key role in leading the project to successful implementation, overcoming challenges and ensuring timely delivery.
  • Solution architecture: We designed and implemented complex distributed data platforms for real-time pipelines and API exposure.
  • Technology selection and integration: We chose and integrated cutting-edge open-source technologies such as Apache Kafka, Kubernetes, and GraphQL.