We support a wide variety of industries
Do you have a lot of data? Call us!
Healthcare
The Client is a multinational company that develops medical devices and pharmaceuticals. They needed an advanced analytics platform that would allow hospital staff to run applications based on a unified source of information. Datumo built the solution from scratch on Azure, using AKS and Databricks. It supports multiple use cases, scales easily and complies with the strict regulations of the medtech industry. The applications gather data from robotic devices across a variety of locations, which improves decision-making during surgeries and treatment.
E-commerce 1
This project is conducted for a Client who hosts one of the largest e-commerce platforms in Europe. The main goal is to enable scalability and rapid development of existing processes by migrating core Big Data applications and infrastructure from the on-premises Hadoop environment to Google Cloud Platform in a cost-efficient way. The major challenges include designing and optimizing Big Data workflows using Apache Spark, Dataproc, BigQuery and Cloud Composer. Our team is responsible for the migration process and for mentoring other teams on data-related matters.
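For illustration, here is a minimal PySpark sketch of the kind of batch job such a migration involves: reading raw data from GCS and writing an aggregate to BigQuery via the spark-bigquery connector. The bucket, dataset and column names are hypothetical placeholders, not the Client's actual setup.

```python
# Minimal sketch of a migrated batch job: read raw events from GCS and write
# an aggregated table to BigQuery via the spark-bigquery connector.
# Bucket, dataset and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-aggregation").getOrCreate()

orders = spark.read.parquet("gs://example-raw-zone/orders/dt=2024-01-01/")

daily_revenue = (
    orders
    .groupBy("shop_id", F.to_date("created_at").alias("order_date"))
    .agg(
        F.sum("total_amount").alias("revenue"),
        F.count("*").alias("order_count"),
    )
)

(
    daily_revenue.write
    .format("bigquery")
    .option("table", "example_analytics.daily_revenue")
    .option("temporaryGcsBucket", "example-staging-bucket")
    .mode("overwrite")
    .save()
)
```

On Dataproc, a job like this can be submitted and scheduled from Cloud Composer, which also takes care of retries and dependencies between workflows.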
E-commerce 2
merXu is an online trading platform for enterprises from the CEE region. To allow smooth B2B online trading on a national and international scale, accurate business decisions are crucial, so the analytics team needs to work with the company's data effectively. merXu built their Big Data platform on GCP, using services such as BigQuery, PubSub, Dataflow and GCS. Datumo's role was to stabilize the existing implementation, replace some solutions with better-fitting GCP services and provide generic pipelines for future needs. Apache Airflow was deployed to orchestrate all the processes. The introduced tools and improvements made it possible to set up complex pipelines with automatic backups and extensive monitoring.
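As a sketch of what such a generic, orchestrated pipeline can look like, here is a minimal Airflow DAG: a daily BigQuery transformation followed by a table export to GCS as a simple backup. The DAG id, dataset, table and bucket names are hypothetical placeholders.

```python
# Minimal Airflow DAG sketch: daily BigQuery transformation plus a GCS backup.
# DAG id, dataset, table and bucket names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.bigquery_to_gcs import BigQueryToGCSOperator

with DAG(
    dag_id="daily_offers_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Refresh an aggregated reporting table from the raw events table.
    build_report = BigQueryInsertJobOperator(
        task_id="build_report",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE example_dataset.daily_offers AS
                    SELECT seller_id, DATE(created_at) AS offer_date, COUNT(*) AS offers
                    FROM example_dataset.raw_offers
                    GROUP BY seller_id, offer_date
                """,
                "useLegacySql": False,
            }
        },
    )

    # Export the refreshed table to GCS as a simple daily backup.
    backup_to_gcs = BigQueryToGCSOperator(
        task_id="backup_to_gcs",
        source_project_dataset_table="example_dataset.daily_offers",
        destination_cloud_storage_uris=["gs://example-backups/daily_offers/{{ ds }}/*.avro"],
        export_format="AVRO",
    )

    build_report >> backup_to_gcs
```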
IoT
The Client is one of the biggest pharmaceutical companies in the world. Our project aims to transfer data from a local factory environment to the Cloud, and to process and visualize it there. The solution's main technical stack is Kafka, Databricks and Power BI, with Microsoft Azure Cloud tools used to enhance or replace current pipeline elements. Machine learning, environment management and database architecture are also involved. The solution is efficient, reliable and easily scalable to a larger number of factories. It allows the on-site IoT engineers to react quickly to possible PLC driver misconfigurations and thereby minimize waste and optimize production costs.
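A minimal sketch of the ingestion step described above, assuming a Spark Structured Streaming job (for example run as a Databricks job) that reads factory sensor readings from Kafka and lands them in a Delta table; the broker address, topic name, schema and paths are hypothetical placeholders.

```python
# Minimal Structured Streaming sketch: ingest factory sensor readings from
# Kafka into a Delta table. Broker, topic, schema and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("factory-sensor-ingest").getOrCreate()

schema = StructType([
    StructField("device_id", StringType()),
    StructField("metric", StringType()),
    StructField("value", DoubleType()),
    StructField("event_time", TimestampType()),
])

readings = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.internal:9092")
    .option("subscribe", "factory-sensors")
    .load()
    # Kafka delivers the payload as bytes; parse the JSON value into columns.
    .select(F.from_json(F.col("value").cast("string"), schema).alias("r"))
    .select("r.*")
)

(
    readings.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/factory-sensors")
    .outputMode("append")
    .start("/mnt/bronze/factory_sensors")
)
```

From such a landing table, downstream jobs can aggregate the readings and feed reports in Power BI.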
Telecommunications
The Client is a major provider of streaming services in Southern Asia. They needed a clickstream analytics platform to monitor user behavior and experience. Datumo migrated the previous Vertica-based solution, expanded the analytics and added machine learning to provide recommendations. The platform was delivered on-premises and is easily scalable thanks to the use of open-source technologies such as Apache Kafka, Apache Spark and Apache Druid. The analytics, presented in a clear GUI of graphs and reports, led to better decision-making, such as content allocation for faster downloads, and improved user experience (e.g. recommendations tailored to individual viewers).
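To illustrate the analytics side, here is a minimal sketch of an event-time aggregation such a clickstream platform might run with Spark Structured Streaming: counting plays per content item in five-minute windows. The topic, broker and field names are hypothetical, and the console sink stands in for the real serving layer (Apache Druid in this project).

```python
# Minimal sketch of a clickstream aggregation: plays per content item in
# 5-minute event-time windows. Topic, broker and field names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("clickstream-windows").getOrCreate()

schema = StructType([
    StructField("content_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

clicks = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker.example.internal:9092")
    .option("subscribe", "clickstream")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

plays_per_window = (
    clicks
    .where(F.col("event_type") == "play")
    # Tolerate events arriving up to 10 minutes late before closing a window.
    .withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "5 minutes"), "content_id")
    .count()
)

# Console sink for illustration only; in production the results would be
# served from a store such as Apache Druid.
plays_per_window.writeStream.outputMode("update").format("console").start()
```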
Banking
The Client wants to create a centralised data platform to ensure smooth data flow and make data accessible to various teams. The key factors include data integrity, compliance, accessibility and watertight security. Datumo focuses on implementing and improving Hadoop-based solutions that collect data from various systems. Great emphasis was placed on code quality through detailed code reviews and test coverage of at least 90%. Datumo's team introduced many improvements: code refactoring, updated deployment scripts, reduced execution times and adoption of new features of the applied technologies.
Real Estate
The Client is a multinational real-estate company that matches global supply and demand for office space. The existing data processing infrastructure is built on a combination of Google Cloud Platform technologies, with BigQuery leveraged to process the available data efficiently. Datumo assists the Client in building a more robust and efficient system by introducing cutting-edge technologies and teaching development best practices. With our support, the Client has been able to replace their existing manual data processing pipelines with a state-of-the-art automatic orchestration system built with Cloud Composer and Terraform.
Insurance
Our Client, a company that invests capital generated by insurance services, uses a Big Data platform built in Azure, with Databricks as the main service running complex ETL and reporting processes. Dozens of Spark jobs are automated to run daily, producing complex data transformations. Moreover, the Client frequently needs new features in existing pipelines, and such continuous growth is a huge challenge for the development team. Datumo's experts ensure that both maintained and newly added tools are implemented in an optimized way and make full use of the platform's capabilities. We use our Big Data experience to maximize the platform's efficiency and optimize its costs. Introducing best practices and generic solutions prepares the development team for future challenges and meets all of the Client's requirements.
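As a sketch of the "generic solutions" idea, the snippet below keeps a transformation as a pure function over DataFrames, so the same logic can be reused across pipelines and covered by tests. The table, column and path names are hypothetical placeholders, not the Client's actual model.

```python
# Sketch of a reusable, testable transformation for a daily Databricks ETL job.
# Table, column and path names are hypothetical placeholders.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def monthly_position_summary(transactions: DataFrame) -> DataFrame:
    """Aggregate raw transactions into a monthly per-instrument summary."""
    return (
        transactions
        .withColumn("month", F.date_trunc("month", F.col("trade_date")))
        .groupBy("portfolio_id", "instrument_id", "month")
        .agg(
            F.sum("quantity").alias("net_quantity"),
            F.sum(F.col("quantity") * F.col("price")).alias("gross_value"),
        )
    )


if __name__ == "__main__":
    # Example wiring for a scheduled job; the storage paths are placeholders.
    spark = SparkSession.builder.appName("monthly-position-summary").getOrCreate()
    transactions = spark.read.format("delta").load("/mnt/silver/transactions")
    summary = monthly_position_summary(transactions)
    summary.write.format("delta").mode("overwrite").save("/mnt/gold/monthly_positions")
```

Keeping transformations separate from input and output wiring is one of the practices that makes dozens of daily jobs easier to maintain, test and optimize.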
Make a BIG data breakthrough!
Send us your inquiry via the contact form. We will get back to you, and together we will discuss the best course of action for your data.