Our services

We help you achieve your business goals with a unique approach to building software and advanced programming techniques.

Architecture Review

We provide a deep analysis of your current solution architecture. Together, we find solutions to tough technical debt and performance issues to deliver maximum business impact.

System Audit

We review your code and provide innovative solutions, using cutting-edge techniques to ensure your code shines.

Performance Review

We benchmark and analyze performance component by component. Our comprehensive report helps you find cost-effective ways to make your product fly.

Innovative Solutions

We use cutting-edge knowledge and tooling to deliver highly efficient software that fits your requirements and time frame.

Modernize Legacy

We augment existing solutions with new capabilities or replace critical components with better-performing ones when a full rewrite is too costly.

Proof of Concept

We boost your time to market by proving ideas with solid prototypes, committing to a simple yet powerful design that can be turned into production-quality code without a rewrite.

Our favourite tools

Type-level programming
Scala
ZIO

Portfolio

A selection of projects we have built with our favourite language, Scala, using the functional programming paradigm.

Empowering domain experts to test and execute business logic from specifications


With the addition of a business-oriented language, we eliminated costly development cycles and reduced the time-to-production of new features from weeks to hours.

We had to solve the problem of a long development cycle. Technical specifications for extracting data from equipment status reports were created by domain experts and had to go through a full development process before becoming available for validation and deployment.

The volume of new features and changes was significant enough to occupy a team of developers, preventing the project from scaling.

We introduced a custom programming language for describing data extractions, with syntax that closely follows the way specifications were originally written. We also added the interpretation and execution capabilities needed to support the internal language in the product.

This allowed domain experts to express data extraction logic directly as programs, ready for testing and execution immediately, completely bypassing the long and expensive development cycle.
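The shape of such an internal language can be sketched as follows. This is a minimal illustration with hypothetical names, not the actual language we shipped: extraction logic is plain data, so a specification becomes a program that can be tested and run immediately.

```scala
// Minimal sketch (hypothetical names) of an internal extraction language:
// domain experts write expressions, and the product interprets them directly.
sealed trait Expr
case class Field(name: String)      extends Expr // read a field from the report
case class Lit(value: String)       extends Expr // a literal string
case class Concat(l: Expr, r: Expr) extends Expr // combine two extractions

def eval(expr: Expr, report: Map[String, String]): String = expr match {
  case Field(n)     => report.getOrElse(n, "")
  case Lit(v)       => v
  case Concat(l, r) => eval(l, report) + eval(r, report)
}

// A "specification" becomes an executable, testable program:
val statusLine = Concat(Field("unit"), Concat(Lit(": "), Field("status")))
val result = eval(statusLine, Map("unit" -> "PSU-1", "status" -> "OK"))
// result == "PSU-1: OK"
```

Because the language is interpreted inside the product, a new extraction rule needs no build, release, or deployment step to become available.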


Reducing troubleshooting from days to hours for a major mobile carrier


By building customizable data pipelines and leveraging machine learning models, we reduced the troubleshooting and analysis of failed network equipment tests from days to hours for one of the largest telecom providers in North America.

The goal was to accelerate the deployment of new network services. Every new service requires a significant amount of lab testing of telecommunication equipment, and the analysis of failures relied heavily on manual processing.

The application we built had to ingest large binary files with test results and expose two APIs for end users to script test scenarios: a streaming API to describe operations on all input data, and a packet API to describe operations on individual network packets. The APIs were exposed as a custom compiled scripting language, with the compiler responsible for verifying syntax, types, and termination of programs. This replaced the manual effort of searching for patterns in test results and extracting the necessary data. The extracted data then went through a machine learning model for an initial verdict, leaving only the verification of results to human testers.
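To give a flavour of the packet API, here is a hedged sketch with invented names (the real language was compiled, not embedded like this): packet rules are plain data with no recursion or loops, so every rule terminates by construction, which is one way a compiler can guarantee termination.

```scala
// Hypothetical sketch of the packet API: users describe checks on a
// network packet as data, and the engine applies them to parsed results.
case class Packet(proto: String, lengthBytes: Int)

sealed trait PacketRule
case class ProtoIs(p: String)                extends PacketRule
case class LongerThan(bytes: Int)            extends PacketRule
case class And(a: PacketRule, b: PacketRule) extends PacketRule

def matches(rule: PacketRule, pkt: Packet): Boolean = rule match {
  case ProtoIs(p)    => pkt.proto == p
  case LongerThan(n) => pkt.lengthBytes > n
  case And(a, b)     => matches(a, pkt) && matches(b, pkt)
}

// Rules replace manual pattern searching over test results:
val suspicious = And(ProtoIs("TCP"), LongerThan(1500))
val hits = List(Packet("TCP", 2000), Packet("UDP", 400))
  .filter(matches(suspicious, _))
// hits == List(Packet("TCP", 2000))
```

The matching packets would then feed the machine learning model for an initial verdict, as described above.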


Increasing quality and reliability of data preparation for ETL and analytics


We built a highly scalable streaming engine to automate manual, ad hoc, and repetitive data-processing and ETL tasks, significantly reducing the support costs of existing ETL solutions.

Every new source of data connected to existing software requires additional effort to unify formats, remove irrelevant parts, join with existing datasets, and so on, in order to make the external data compatible with the internal model. The problem was the amount of manual, ad hoc data-preparation work performed by the development teams operating the data-driven solution.

We implemented a series of DSLs to formally represent operations on data as an acyclic graph of steps needed to get from the initial state to the desired one. Combined with visual tools and compiled into a runnable stream executed by a horizontally scalable engine, this approach reduced the need for ad hoc development to a bare minimum, allowing data analysts and integrators to build, test, and run pipelines on their own.
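The idea of "pipeline as data, compiled to something runnable" can be sketched in a few lines. The step names below are hypothetical and the real engine compiled to distributed streams rather than to a local function; this only illustrates the principle.

```scala
// Minimal sketch (hypothetical names): a data-preparation pipeline is a
// sequence of declarative steps, "compiled" into a runnable function.
sealed trait Step
case class Normalize(field: String)         extends Step // e.g. lower-case a field
case class Drop(field: String)              extends Step // remove irrelevant parts
case class Rename(from: String, to: String) extends Step // unify formats

type Record = Map[String, String]

def compile(steps: List[Step]): Record => Record = {
  val stages: List[Record => Record] = steps.map {
    case Normalize(f)  => (r: Record) => r.get(f).fold(r)(v => r.updated(f, v.toLowerCase))
    case Drop(f)       => (r: Record) => r - f
    case Rename(f, t)  => (r: Record) => r.get(f).fold(r)(v => (r - f).updated(t, v))
  }
  stages.foldLeft((r: Record) => r)(_ andThen _)
}

// Analysts assemble steps without writing ad hoc code; the engine runs them.
val prepare = compile(List(
  Rename("CustName", "customer"),
  Normalize("customer"),
  Drop("internalId")
))
val out = prepare(Map("CustName" -> "ACME Corp", "internalId" -> "42"))
// out == Map("customer" -> "acme corp")
```

Because steps are plain data, the same pipeline description can back a visual editor, a validator, and the streaming runtime.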


Collecting sensor data and making it auditable and traceable


We've created a platform to connect wearable devices and phones to track and visualize real-time biometric data on a timeline and build historical reports for analytics.

"OmRun", a next generation running platform, which helps runners track all the data they may ever want by using a sensor-equipped sport bra and a phone. Our focus was to handle incoming signals in real time and apply machine learning algorithms to biometric data.

We designed and developed software modules that move data from wearable devices to backend systems in the cloud, making an interim stop on the user's phone. To display meaningful running metrics, all data eventually had to be arranged on a timeline, so we used a combination of event sourcing and finite state machines to ensure overall correctness during the transfer.
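The event sourcing plus finite state machine combination can be sketched as follows, with hypothetical event and state names: sensor readings arrive as an event log, and a state machine replays the log to rebuild a consistent timeline, which is what makes the data auditable and traceable.

```scala
// Hypothetical sketch: sensor readings are events; a finite state machine
// replays the event log to rebuild a consistent state (event sourcing).
sealed trait Event
case class SessionStarted(at: Long)      extends Event
case class HeartRate(at: Long, bpm: Int) extends Event
case class SessionStopped(at: Long)      extends Event

sealed trait State
case object Idle                                        extends State
case class Running(startedAt: Long, samples: List[Int]) extends State
case class Finished(samples: List[Int])                 extends State

def step(state: State, event: Event): State = (state, event) match {
  case (Idle, SessionStarted(t))           => Running(t, Nil)
  case (Running(t, xs), HeartRate(_, bpm)) => Running(t, xs :+ bpm)
  case (Running(_, xs), SessionStopped(_)) => Finished(xs)
  case (s, _)                              => s // unexpected events leave state intact
}

// Replaying the same log always yields the same state: auditable by design.
val log = List(SessionStarted(0L), HeartRate(1L, 120), HeartRate(2L, 124), SessionStopped(3L))
val finalState = log.foldLeft(Idle: State)(step)
// finalState == Finished(List(120, 124))
```

Since the event log is the source of truth, any point on the timeline can be reconstructed and audited after the fact.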


Another use case? Let's talk!