Full-Stack Software Engineer focused on the TypeScript/JavaScript ecosystem, capable of handling the full software delivery cycle.
I've produced Web Apps, Services, LLM integrations, ETL pipelines, Infrastructure modules, engineering and business tooling, Admin UIs, physics-based visualizations, custom DSLs, Voice-controlled UIs, AR 3D apps and much more.
Zoom. Pan. Tap.
Check out this scene live in my CodePen.
I was born in Mogilev, Belarus, moved to Vitebsk by the age of one, lived there until I was six, and then spent 10 years in Dribin before pursuing my education and career in Minsk.
I had to move to Poland in 2021, where I currently reside.
I've been living on my own since the age of 16 and spent a good part of my childhood working on our family's farm 👨‍🌾.
As soon as my family got our first computer, I spent countless hours messing with graphic and audio design software and tweaking the OS to my liking. I've also spent A LOT of time playing video games 🎮.
One of the rare photos from my last school years where I'm not doing something I would later regret.
To be honest, I wasn't the most diligent student during my school years, yet I never had any issues studying and getting through exams.
I graduated high school with an average grade of 8.6 (out of 10) and a huge pile of drawings made during some of the lessons.
I'm infinitely grateful to the tutors of BRU Lyceum in Mogilev, Belarus for the desire to learn they managed to instill in me and other students.
I was very excited to learn the basics of programming in Pascal and to create digital content with Macromedia Flash back in the day.
One of the classrooms where we studied electronic circuits.
I applied to Industrial Robotics, hoping to obtain a unique and in-demand skill set for Industry 4.0.
Three years in, we were being taught programming in BASIC and the mechanics of industrial robots developed in the USSR in the 80s. The lack of up-to-date knowledge was frustrating.
My university education ended right where my Software Engineer career started 🧑‍💻.
During one of the exam sessions at BNTU, after passing all the tests, I had two weeks to do whatever I wanted.
I spent them building several small apps with pure JavaScript: a Minesweeper game, a Step Sequencer for making simple beats and a Color Palette Generation tool. These became my first portfolio, and I got my first Software Engineer job within a month of making them.
You can also check out this Minesweeper live. Please keep in mind that it was written even before my Junior days.
Six years later, I'm still as excited about programming and solving problems as I was back then.
I fixed my first bug, an incorrect layout of a slide panel, on my very first official day of work. It was amazing.
I will never forget realizing how little I actually knew. Yet it was great to learn new things literally all the time.
I've also continued building some small things just for fun.
Luckily, I got up to speed quickly and became useful for the project and the team.
By the end of the first year, I was already capable of delivering significant features on my own, directly interacting with Product Owners and the rest of the organisation. I also became a "Code Champion" in multiple areas of the Project I was working on.
In recognition of my contribution, I was awarded the long-desired "Middle" badge.
I was feeling significantly more comfortable with higher level abstractions and overall application architecture.
Like any other JavaScript developer at that stage, I attempted to build my own frontend framework. In the end it turned out to be very similar to ExtJS, so I abandoned it.
The render cycle from that OLD endeavour of mine. You can check out the whole root component class in this gist. Please keep in mind that it was written before React or even AngularJS became popular, when jQuery still ruled the minds of "Web Designers".
I joined a new company and a new project as a Middle Software Engineer.
Changing a familiar toolset and organisation was hard at first, but I was excited about the new possibilities, so I adapted quickly.
One of the interesting side-projects I built at that time: a D3-based graph of Webpack build stats.
I was given a lot of autonomy right from the beginning and used it to propose new areas for the project to explore.
I was promoted to Senior developer after delivering multiple projects on my own, some of which were crucial for the future of the Company.
One of the CodePens I built during this period. It simulates the flocking behavior of birds.
The definition of seniority is a debatable topic and varies greatly across the industry. In my opinion, seniority is defined by two properties:
My first project was a very large CRM, focused on streamlining interaction with customers of various businesses.
The CRM consisted of a plethora of interesting tools for creating digital content tailored to each individual customer of a particular business and distributing it through channels ranging from SMS to postcards.
Some of the components I worked on.
It featured components such as a full-featured text editor (in 2014, before Google Docs were even on the horizon), a WYSIWYG editor for HTML-powered content, a powerful lead tracking tool, project tracking software, a cloud storage explorer and much more.
I also managed to put my Graphic Design skills to use to deliver some features autonomously.
I was extremely lucky that my first job had me working on such well-architected software, in a team of people with decades of experience.
Alongside the main project, I participated in building a Google Sheets-like web application for insurance in the banking sector.
It featured an infinite grid of cells with custom content and custom formula evaluation.
Unfortunately, the project was never used in production.
After the previous project closed, I participated in building a solution for managing and controlling assets in the field of water management.
The software allowed provisioning, monitoring and control of smart devices for both industrial and privately-owned facilities.
In my free time, I also built a helper desktop app to manage and test connected devices.
This project became my first full-stack experience in production.
I was lucky to join a project with a vast scope and impact on the real world - a SaaS platform for IoT devices.
Initially, it wasn't focused on any particular industry area, allowing any smart device with access to the Internet to persist its data and communicate in real-time with other such devices.
The Platform itself was extremely feature-rich, including SDKs for building device firmware and client applications, a highly complex rule engine for processing incoming events, an abstract data model capable of representing anything from a T-Shirt to a Smart Water Heater, rich analytics over collected data, geolocation for incoming events, image recognition, real-time pub/sub for most of the entities and events, fully customizable dashboards (akin to AWS CloudWatch) and much more.
Over the years, I worked with most of the Platform's components, applying them to vastly different use-cases, ranging from a simple page for a digitized product to a fully-fledged Warehouse Management and Analytics solution built on top of the main Platform.
My last day at the Company is something I still remember warmly.
Later, the company focused on Tagged Products use-cases, collaborating with world-leading brands on digitizing their products and their lifecycle in the supply chain.
For my onboarding to the Platform, I was tasked with creating a sample client application for interacting with the Platform APIs.
The application allowed Users to observe and control their devices connected to the Platform.
A screenshot of one of the app pages. It featured real-time updates and a toggle for switching to a local IoT Hub instead of Cloud APIs.
Alongside the application itself, I proposed an implementation of a UI kit for quickly assembling such applications in the future.
Sample code allowing quick assembly of a page for working with a certain IoT device via Platform APIs.
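For illustration, here's a minimal TypeScript sketch of what assembling such a page from the UI kit could look like; the package names, components and client calls below are hypothetical, not the actual kit API.

```typescript
// Hypothetical UI-kit usage: the imports and component names are illustrative.
import { PlatformClient } from '@platform/sdk';                             // assumed SDK entry point
import { DevicePage, PropertyList, ActionButton } from '@platform/ui-kit'; // assumed UI kit

const client = new PlatformClient({ apiKey: process.env.PLATFORM_API_KEY! });

// Assemble a page for a single IoT device out of reusable building blocks.
const page = new DevicePage({
  title: 'Smart Water Heater',
  sections: [
    // Live properties rendered with real-time updates over the Platform pub/sub.
    new PropertyList({ deviceId: 'heater-42', properties: ['temperature', 'mode'] }),
    new ActionButton({ label: 'Boost', onClick: () => client.devices.invoke('heater-42', 'boost') }),
  ],
});

page.mount(document.getElementById('root')!);
```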
I implemented a complete revamp of the emails delivered by the Platform, migrating from text-only to consistently styled HTML.
Among its features were a component system for quickly composing new types of emails and full support for older email clients like Outlook.
Reusable email button, for example.
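The original component isn't shown here, so below is a hedged TypeScript sketch of the general approach: a table-based, inline-styled button, the kind of markup that keeps rendering correctly in older clients like Outlook. The helper name and options are illustrative.

```typescript
// Hypothetical helper producing table-based button markup that survives Outlook's renderer.
interface EmailButtonOptions {
  label: string;
  href: string;
  color?: string; // background color, defaults to a brand blue
}

export function renderEmailButton({ label, href, color = '#1a73e8' }: EmailButtonOptions): string {
  // Tables + inline styles are the lowest common denominator across email clients.
  return `
    <table role="presentation" cellpadding="0" cellspacing="0" border="0">
      <tr>
        <td bgcolor="${color}" style="border-radius: 4px;">
          <a href="${href}"
             style="display: inline-block; padding: 12px 24px; font-family: Arial, sans-serif;
                    font-size: 14px; color: #ffffff; text-decoration: none;">
            ${label}
          </a>
        </td>
      </tr>
    </table>`;
}
```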
As our application featured Leaflet-based maps in multiple places, we decided to unify the developer experience and wrap the commonly used features in an easy-to-use package.
The main focus of the abstraction was programmatic control of the component: displaying a dynamic map legend, loading/unloading markers with clusters and drawing custom polygons over the map.
Among the interesting features was automatic data clustering on the backend, to avoid transferring colossal amounts of data at continent-level zooms.
Another goal was to provide the desired default behavior with little or no additional configuration for basic use-cases.
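As a rough illustration of the wrapper's shape, here's a hedged TypeScript sketch; only the underlying Leaflet calls are real library API, the class and method names are mine.

```typescript
// A minimal sketch of the kind of wrapper described above; illustrative, not the real package.
import * as L from 'leaflet';

export interface MarkerData { id: string; lat: number; lng: number; }

export class MapView {
  private map: L.Map;
  private markers = L.layerGroup();

  constructor(container: HTMLElement) {
    // Sensible defaults so basic use-cases need no extra configuration.
    this.map = L.map(container).setView([52.23, 21.01], 4);
    L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png').addTo(this.map);
    this.markers.addTo(this.map);
  }

  /** Replace the currently displayed markers (e.g. with server-side clustered data). */
  setMarkers(data: MarkerData[]): void {
    this.markers.clearLayers();
    data.forEach((m) => this.markers.addLayer(L.marker([m.lat, m.lng])));
  }

  /** Draw a custom polygon (e.g. a facility boundary) over the map. */
  drawPolygon(points: [number, number][]): L.Polygon {
    return L.polygon(points, { color: '#3388ff' }).addTo(this.map);
  }
}
```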
The Platform provided a fairly uniform experience for most entities, consisting of multiple reusable blocks of behavior, such as tagging, storing custom data, assigning custom identifiers, and scoping resources to a specific project within the account.
When adding another resource type, we opted to unify our developer experience by building CRUD pages for such resources too.
This approach allowed us to reduce the delivery time for such basic pages from weeks to days for each new resource.
The experience included behaviors such as listing entities, filtering them by available fields, creating new entities, and editing existing ones, all built with the same exact reusable components.
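A hedged sketch of the config-driven idea, with hypothetical types and a hypothetical registerCrudPage() helper standing in for the real internals:

```typescript
// Illustrative shape of a config-driven CRUD page; not the actual internal API.
interface FieldConfig {
  name: string;
  label: string;
  filterable?: boolean; // exposes the field in the list filter bar
}

interface CrudPageConfig<T> {
  resource: string;                          // e.g. 'products', 'places', 'gateways'
  fields: FieldConfig[];                     // drives both the list columns and the edit form
  toRow: (entity: T) => Record<string, string>;
}

declare function registerCrudPage<T>(config: CrudPageConfig<T>): void;

// Adding a new resource becomes a matter of describing it, not rebuilding list/filter/form UI.
registerCrudPage({
  resource: 'gateways',
  fields: [
    { name: 'name', label: 'Name', filterable: true },
    { name: 'status', label: 'Status', filterable: true },
    { name: 'createdAt', label: 'Created' },
  ],
  toRow: (g: { name: string; status: string; createdAt: string }) => g,
});
```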
The Platform offered a wide range of APIs, and we frequently needed to trigger test sequences or exercise control over entities beyond the basic CRUD model.
A screenshot of one of the tool's pages, designed to dispatch custom actions on specific resources.
We've developed an API client similar to Postman, enabling users to perform requests and observe the outcomes.
As the Platform aggregated huge amounts of data, we needed to expose the most typical KPIs and metrics for our users' accounts.
All widget positions, layouts and contents were fully customizable to a particular Account's needs.
We approached it by building a set of reusable visualisation components which could be connected to our Analytics API. The components allowed tweaking metric dimensions and content to the needs of a particular user.
As the Platform featured dozens of resources and APIs, not all of them were utilised in every solution. On top of that, some solutions required restricting the content users could access based on their Role in the Account.
We approached the problem by implementing fine-grained Role- and Attribute-based access control for the application.
We implemented Role- and Attribute-based access control across the whole application, controlling which resources a User with a certain Role can access and which pages they can reach.
It allowed an Account Administrator to define which resources and pages are available to certain types of users in their respective solution.
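Conceptually, the checks boiled down to something like the following hedged sketch (the data shapes are illustrative, not the Platform's actual permission model):

```typescript
// A simplified sketch of role/attribute-based access checks.
type Action = 'read' | 'create' | 'update' | 'delete';

interface Role {
  name: string;
  // Resource type -> allowed actions, e.g. { products: ['read'], dashboards: ['read', 'update'] }
  permissions: Record<string, Action[]>;
  // Pages (routes) the role is allowed to reach.
  pages: string[];
}

interface User { id: string; role: Role; projectIds: string[]; }

export function canAccess(user: User, resource: string, action: Action, projectId?: string): boolean {
  const allowed = user.role.permissions[resource] ?? [];
  if (!allowed.includes(action)) return false;
  // Attribute check: the resource must belong to a project the user is scoped to.
  return projectId === undefined || user.projectIds.includes(projectId);
}

export function canReachPage(user: User, page: string): boolean {
  return user.role.pages.includes(page);
}
```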
To better support different use-cases on top of the Platform, I implemented a system allowing an Account Administrator to define a timeout for active sessions in their Account.
Alongside the custom duration, it supported active session maintenance, logging users out after a period of inactivity, as some of our Clients required sessions as short as 10 minutes.
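A minimal sketch of the idle-timeout part, assuming a logout() callback and an account-level timeout value supplied from elsewhere:

```typescript
// Logs the user out after a period of inactivity; event list and wiring are illustrative.
export function watchIdleSession(timeoutMs: number, logout: () => void): () => void {
  let timer: ReturnType<typeof setTimeout>;

  const reset = () => {
    clearTimeout(timer);
    timer = setTimeout(logout, timeoutMs); // restart the countdown on any user activity
  };

  const events: (keyof WindowEventMap)[] = ['mousemove', 'keydown', 'click', 'scroll'];
  events.forEach((e) => window.addEventListener(e, reset, { passive: true }));
  reset();

  // Return a disposer so the watcher can be torn down on logout or route change.
  return () => {
    clearTimeout(timer);
    events.forEach((e) => window.removeEventListener(e, reset));
  };
}

// Example: some Clients required sessions as short as 10 minutes.
// const dispose = watchIdleSession(10 * 60 * 1000, () => auth.signOut());
```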
I implemented a neat feature providing real-time feedback to the User on whether the password complexity criteria are met.
Small recording of a component in action.
I've managed to find a neat way to integrate these dynamic visuals with the form framework we used.
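The gist of it, as a hedged sketch (the criteria list is illustrative):

```typescript
// Each criterion is evaluated on every keystroke and rendered as met/unmet.
interface Criterion { label: string; test: (value: string) => boolean; }

const criteria: Criterion[] = [
  { label: 'At least 8 characters', test: (v) => v.length >= 8 },
  { label: 'Contains an uppercase letter', test: (v) => /[A-Z]/.test(v) },
  { label: 'Contains a digit', test: (v) => /\d/.test(v) },
  { label: 'Contains a special character', test: (v) => /[^A-Za-z0-9]/.test(v) },
];

export function evaluatePassword(value: string): { label: string; met: boolean }[] {
  return criteria.map((c) => ({ label: c.label, met: c.test(value) }));
}

// Wiring it to an input gives the user immediate feedback while typing, e.g.:
// passwordInput.addEventListener('input', () =>
//   renderChecklist(evaluatePassword(passwordInput.value)));
```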
As our codebase grew, style builds became increasingly slow, reaching 40-50 seconds after each change. As the Compass ecosystem didn't allow us to scale further, we decided to drop it in favor of a libsass implementation.
I implemented a replacement pipeline compiling the existing SCSS with libsass. It reduced build times to 1-2 seconds and enabled much quicker iteration on application development.
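A rough sketch of the pipeline's core, assuming node-sass and chokidar with illustrative paths:

```typescript
// Recompile SCSS with libsass (node-sass) on every change instead of the old Compass run.
import * as sass from 'node-sass';
import * as chokidar from 'chokidar';
import { writeFileSync } from 'fs';

function compile(): void {
  const start = Date.now();
  const result = sass.renderSync({
    file: 'src/styles/main.scss',
    includePaths: ['src/styles'],
    outputStyle: 'compressed',
  });
  writeFileSync('dist/main.css', result.css);
  console.log(`Styles rebuilt in ${Date.now() - start}ms`);
}

// Watch the SCSS sources and rebuild on change.
chokidar.watch('src/styles/**/*.scss').on('change', compile);
compile();
```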
As our Application had its own backend, we maintained a test suite for its APIs on top of the Platform APIs.
I've implemented a meta-framework to make implementing backend tests less painful.
This approach enabled us to bootstrap a new test suite with a single import containing all the necessary tools preconfigured, with the application server running in a production-like environment and ready to accept requests. It also provided mocks for the main Platform APIs, so tests could run in complete isolation either on CI/CD instances or on a local machine.
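A hedged sketch of what "a single import bootstraps everything" could look like; the module and helper names below are hypothetical:

```typescript
// Hypothetical meta-framework entry point: starts the app server in a production-like
// mode, wires Platform API mocks, and returns a preconfigured HTTP client.
import { bootstrapTestSuite } from './test-meta-framework';

const { api, platformMock, teardown } = bootstrapTestSuite();

describe('Devices API', () => {
  afterAll(() => teardown());

  it('lists devices from the Platform', async () => {
    platformMock.givenDevices([{ id: 'dev-1', name: 'Heater' }]);

    const response = await api.get('/api/devices');

    expect(response.status).toBe(200);
    expect(response.body).toHaveLength(1);
  });
});
```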
As solutions built on top of the Platform grew increasingly different, there was a need for deep customization of the content in a Client's Account.
A screencast of editing one of the Custom Dashboards
I implemented a grid-based dashboard framework allowing Users to create new application pages and edit existing ones, with the ability to place pre-defined or fully custom widgets. The framework also supported exporting configured dashboards for sharing between Accounts.
It also allowed embedding third-party widgets deployed and hosted outside the main application, enabling our Clients' engineering teams to extend the Application to their needs.
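An illustrative shape for such a dashboard configuration (the real schema differed, this only captures the idea):

```typescript
interface WidgetPlacement {
  x: number; y: number; w: number; h: number; // grid coordinates and span
}

type Widget =
  | { kind: 'predefined'; widgetId: string; settings?: Record<string, unknown> }
  | { kind: 'custom'; url: string };          // third-party widget hosted outside the app

interface DashboardConfig {
  id: string;
  title: string;
  widgets: { placement: WidgetPlacement; widget: Widget }[];
}

// Configured dashboards are plain data, so they can be exported and shared between Accounts.
export function exportDashboard(config: DashboardConfig): string {
  return JSON.stringify(config, null, 2);
}
```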
We were approached by a Client operating a fleet of smart devices connected to the Cloud with a request to migrate their assets to our Platform.
I participated in building a solution for communicating with devices over a custom binary protocol.
Alongside communication, we implemented a set of tools to deploy new device firmware over the air.
I also implemented distributed tracing of the AWS Lambda invocations orchestrating the internals of each run, giving us insight for optimization and debugging.
As we had to manage a variety of devices with many business rules, I also implemented a custom scripting engine that could modify device state based on changes to its properties.
Alongside the solution for managing the device fleet on top of the Platform, I implemented a set of custom extensions for managing supported device types, managing device firmware, testing device responses and providing an overview of device state.
A screencast highlighting some of the features for managing how devices are connected to the cloud, along with additional firmware metadata.
During my work on the IoT gateway on AWS Lambda, we were handed one of the legacy components of the Client's system: a microservice written in Dart containing a lot of crucial business logic.
We bridged this service with the Platform APIs, so that its consumers could still use the older Gateway APIs, while the gateway itself used our Platform as its backend.
To provide better interop between the existing codebase and the Platform APIs, I wrote a Dart SDK for the Platform.
In addition to the classic request/response workflow, it supports streaming, implemented with Dart Streams, which makes bulk operations easy.
Example of streams API in action.
As the solution grew, we needed to manage dozens of sequential and parallel AWS Lambda invocations related to certain device workflows.
To tame the complexity of debugging such a solution, I implemented a system for tracing events happening within our Lambdas, collecting and aggregating them for display in our Platform.
One of the traces we recorded. The tool allowed zooming and panning over traces and reviewing additional metadata attached to the events.
In addition to that, I've written an article (RU) on creating such tracers and released a stripped-down version of the tracer. It's compatible with the chrome://tracing format and can give you insight into the HTTP calls issued while your code is running, with just a one-line change.
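For a sense of the mechanics, here's a simplified TypeScript sketch of a tracer emitting Chrome Trace Event Format records, the format chrome://tracing understands; the wrapping of calls is illustrative:

```typescript
// "Complete" events carry a start timestamp and a duration, both in microseconds.
interface TraceEvent {
  name: string;
  ph: 'X';
  ts: number;
  dur: number;
  pid: number;
  tid: number;
}

const events: TraceEvent[] = [];

export async function traced<T>(name: string, fn: () => Promise<T>): Promise<T> {
  const start = Date.now();
  try {
    return await fn();
  } finally {
    events.push({ name, ph: 'X', ts: start * 1000, dur: (Date.now() - start) * 1000, pid: 1, tid: 1 });
  }
}

// Dump as JSON and load the file via chrome://tracing to inspect the timeline.
export const exportTrace = (): string => JSON.stringify({ traceEvents: events });

// Usage: await traced('GET /devices', () => fetch('https://api.example.com/devices'));
```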
Having gained a lot of experience with most of the Platform components, I was promoted to the position of Frontend Lead.
I curated frontend-related meetings and initiatives, conducted interviews for Frontend and Backend positions, helped investigate Platform outages, and maintained and monitored the Platform infrastructure.
As we maintained a mature and constantly growing application, we tried to balance the legacy burden against keeping our technology stack up-to-date.
At a certain stage, we needed better code splitting techniques than we had employed up to that point. We held an architecture council to determine a way forward.
Over one of the weekends, I completed a technical spike proving it was possible to migrate our codebase to Webpack, along with introducing better code splitting and lazy-loading techniques.
Shortly after, we used our Tech Debt quota to implement the changes.
One of the most interesting aspects of the migration was that we avoided refactoring most of the codebase by configuring Webpack to work with the multiple module formats employed over the Application's lifetime.
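A hedged sketch of the kind of configuration involved: splitChunks for code splitting plus a loader rule letting legacy, non-modular files coexist with CommonJS/ES modules. The paths and the specific rule are illustrative, not our exact setup.

```typescript
// webpack.config.ts
import type { Configuration } from 'webpack';

const config: Configuration = {
  entry: './src/app.js',
  output: {
    filename: '[name].[contenthash].js',
    chunkFilename: '[name].[contenthash].js',
  },
  optimization: {
    // Pull shared dependencies into separate, lazily loadable chunks.
    splitChunks: { chunks: 'all' },
  },
  module: {
    rules: [
      {
        // Hypothetical legacy files that attach themselves to globals instead of exporting.
        test: /src[\\/]legacy[\\/].*\.js$/,
        loader: 'exports-loader',
        options: { type: 'commonjs', exports: 'LegacyWidget' },
      },
    ],
  },
};

export default config;
```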
We were approached by a world-renowned Company to help them reduce operation times for some of their internal workflows related to the traceability of goods used in their products.
To tackle that, we built a custom solution on top of the Platform, aggregating data from multiple Client systems and reducing their workflow time from weeks to just seconds.
The solution was based on AWS Lambda, used as an extension point for the Platform to hold custom logic for processing and querying the Client's data.
On top of the data, I built a whole range of components surfacing the information in the Client's terminology and implementing custom visualisations to better reflect its states and conditions.
As one of the solutions I had previously developed entered another active phase, I actively participated in implementing new features and maintaining it.
Among the interesting problems we solved: analysis and aggregation of 250 GB of XML data and programmatically building custom XLSX reports.
As our Customers maintained technology stacks of their own, we needed a mechanism for better integration with the Platform in addition to the existing customization framework.
I implemented a way to serve multiple completely different frontends from a CDN based on the Customer's Account configuration. This enabled us to build highly specialized applications for one of the solutions we maintained, and to safely explore new technology stacks without affecting the main functionality and existing solutions.
We seamlessly integrated this Satellites concept into the main workflows of the Application, supporting RBAC/ABAC, switching between different satellites, and linking to a particular chosen experience.
I also researched the capability to embed these satellite applications directly into the main Application, essentially implementing a microfrontends methodology.
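A simplified sketch of the resolution step behind this setup, with a hypothetical CDN layout and account configuration shape:

```typescript
// Resolve which frontend bundle to serve for an account based on its configuration.
interface AccountConfig {
  accountId: string;
  // e.g. 'default' for the main app, or a solution-specific satellite build.
  frontend: { name: string; version: string };
}

const CDN_BASE = 'https://cdn.example.com/frontends'; // hypothetical CDN location

export function resolveFrontendUrl(config: AccountConfig): string {
  const { name, version } = config.frontend;
  return `${CDN_BASE}/${name}/${version}/index.html`;
}

// The shell application fetches the account configuration on login and either redirects
// to the satellite or embeds it (the microfrontends variant explored later), e.g.:
// window.location.assign(resolveFrontendUrl(await fetchAccountConfig(accountId)));
```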
At some point, it was decided that the capabilities of the Platform were too hard to grasp at a high level, and the business was split into three different lines to better highlight each of the potential use-cases for the Platform.
To better support these business lines, we decided to build multiple standalone experiences suited to each of them, unified by a single visual style and underlying UI kit.
We approached the problem by implementing the capability to serve completely different experiences as part of our main Application. It was built as a monorepo of multiple applications capable of sharing any required parts of the codebase, yet bundled into completely standalone artifacts using the same exact APIs.
As our Application grew, we maintained a set of End-To-End tests around critical user flows.
At some stage, during build pipeline optimisation, we decided to split the main Application codebase from the test codebase, while also covering new user flows.
As we planned to enable different technology stacks in addition to the existing one, during the split I dropped Protractor and replaced it with TestCafe, building a metaframework to make writing e2e tests less painful.
The core idea of the metaframework was to cover basic application interactions, such as login/logout, changing the selected Project and working with the unified CRUD pages, so that instead of relying on instruction sets tied to particular elements, tests could be orchestrated at a higher level, in the terminology of the Application itself.
Sample test using the metaframework.
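A hedged reconstruction of such a test: TestCafe's real API (fixture, test, Selector) plus hypothetical high-level helpers speaking in the Application's own terminology:

```typescript
import { Selector } from 'testcafe';
import { login, selectProject, crud } from './meta'; // hypothetical metaframework helpers

fixture('Gateways CRUD').page('https://app.example.com');

test('creates a gateway from the unified CRUD page', async (t) => {
  await login(t, 'operator@example.com');
  await selectProject(t, 'Warehouse EU');

  // The helper drives the shared CRUD components, so the test reads like a user story.
  await crud(t, 'gateways').create({ name: 'Dock 7 Gateway' });

  await t.expect(Selector('.list-row').withText('Dock 7 Gateway').exists).ok();
});
```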
Another interesting feature was environment setup: I managed to build a system capable of running against any of our environments without additional configuration steps.
Alongside main Product development, we were always looking for ways to streamline and simplify the developer experience, making our work less tedious.
One of the legacy decisions had resulted in a huge Application Docker image: it was over 2 GB.
During optimisation, I reduced the image size to 136 MB and migrated it to multi-stage builds, making it possible to use the same exact Dockerfile for running the Application on a local machine, on CI/CD instances and in Production.
One of the interesting features implemented was pruning development dependencies for the production stage.
As the Platform offered very flexible extension points, we needed to extend their behavior to support more scenarios.
I architected and prototyped an AWS Lambda-based service allowing Platform Clients to create and execute arbitrary scripts.
One of the interesting features was the NPM-based build pipeline: any client script could be installed in any way NPM supports, such as directly from git (including private repos), from a tarball package or from the registry itself.
An example of using the prototyped service: creating a new User Script from a private repo and then querying/modifying it.
As we offered interesting tooling for quickly deploying custom Product pages (also called Experiences) to the Platform, we wanted to make it accessible to non-technical Users.
I architected and prototyped a service for creating and deploying such Experiences.
The prototype also included a WYSIWYG Experience Editor embeddable into our main Application.
One of the early mocks I've produced for the Editor.
A prototype of the GrapesJS integration and a QR code for accessing the deployed Experience.
During my last days at the Company, I switched to a part-time role to support one of the solutions I had previously built.
I worked on the ETL and data ingestion pipeline to optimise the quality of the produced data and cover newly discovered use-cases.
Hiring is hard for tech companies. As an initiative to improve the Company's public image on the local tech scene, we held a mini-conference covering different technical and social aspects of working there.
To showcase some of the interesting technologies we'd gotten our hands on, I prepared a talk and built a demo app allowing real-time interaction between multiple people, based on one of the Platform features.
A demo of the AR experience in action. If multiple people scan the QR code, they can collaborate on this voxel playground in real-time. You can check out the code for the experience here.
As we offered a very large set of features, we needed to showcase the capabilities to our Users during onboarding.
I've implemented a highly integrated experience covering major Platform capabilities.
A portion of the flow. You can see in the video that when creating a Product, an actual Experience is also created and available immediately. This demo leaves aside the feature of auto-generating Product instances, which was also a very interesting topic, as you could generate thousands of containers for your digitized goods right from the start with the Platform.
Alongside the Quick Start, we delivered a new feature making it easy to create custom Experiences associated with Digitized Products. You can check out the starter kit on GitHub.
We were approached by a large company looking to digitize their supply chain, akin to what we had done for some of our previous clients.
After a thorough investigation, we implemented a custom solution based on current supply chain industry standards.
The solution allowed our Client to track and trace shipments of their Product across continents, with the ability to compare expected and actual quantities and analyse any potential anomalies.
One of the cool features I delivered was a Sankey diagram highlighting the recorded transfers of the Product throughout the supply chain.
An example of relatively simple item transfers. The overall state was restored from a sequence of events recorded for each item individually.
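As a sketch of the idea, per-item transfer events can be folded into Sankey links roughly like this (the event shape is illustrative):

```typescript
interface TransferEvent {
  itemId: string;
  from: string; // e.g. 'Factory A'
  to: string;   // e.g. 'Port of Gdansk'
}

interface SankeyLink { source: string; target: string; value: number; }

export function buildSankeyLinks(events: TransferEvent[]): SankeyLink[] {
  // Count how many items moved along each (from, to) edge.
  const counts = new Map<string, number>();
  for (const e of events) {
    const key = `${e.from}→${e.to}`;
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  return [...counts.entries()].map(([key, value]) => {
    const [source, target] = key.split('→');
    return { source, target, value };
  });
}

// The resulting links plug straight into a Sankey layout (e.g. d3-sankey) for rendering.
```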
As I moved to a part-time position, I used the remaining time to pursue my interest in game development.
I fell in love with Flutter the first day I tried it. A very powerful set of abstractions, extreme composability and a highly productive development cycle left me hugely impressed.
After fiddling with Flutter itself, I focused on its rendering capabilities, which are somewhat similar to what HTML5 Canvas provides. As soon as I got comfortable, I started implementing my first attempt at a game.
The first game was an idle progression game about humanity's history and technology. I was also planning to build a basic plot around technological singularity and an AI already being in power. The main game loop was earning progress points to unlock new technologies, which in turn earned more points.
A short recording of "Progress" game onboarding and main screens.
After spending a significant amount of time on it, I realised that to properly finish the game I would need around 7-8 months of work on content alone. Unfortunately, that seemed like overkill for something which might not work at all, so I switched my focus to a simpler game built around procedural content.
I implemented a lot of interesting features in the game, such as a force graph simulation and a reactive inventory system.
While working on my first game, I realised it would require an immense effort to produce enough content, so I focused on another, simpler concept.
The second game even got a name and a website: "My Long Locking Story". I even started promotion activities, trying to get traction in the game dev community on Twitter and Reddit.
All the level types of My Long Locking Story together.
Rather chaotic and quite stressing experience, even without the need to pick the lock! #mylonglockingstory #indiedev #games #gamedev #flutterdev #screenshotsaturday #screenshotsunday #puzzles pic.twitter.com/QX9I4fujjV
— My Long Locking Story (MLLS) (@mlls_hq) December 28, 2019
Unfortunately, as often happens with first-time founders, I burned out due to the endless work cycle and the constant stress of being "out of time".
This game features plenty of interesting code, such as a simple fluid simulation, a number of force-based simulations, a simple scripting runtime, a dynamic dialog system, a level progression system with reactive unlockables, a reputation system and much more.
One of the first levels of "My Long Locking Story"
During the development of "My Long Locking Story", I needed to decouple game scene and level generation from the Dart code itself. The reason was simple: Flutter can't execute Dart code dynamically; it can only run what was compiled into the application bundle.
So, to gain greater flexibility for potential future uses and to simplify the existing level development flow, I explored various scripting technologies. LISP immediately caught my attention because of how easy its core concepts are to understand and how well they are documented.
I wrote a simple LISP scripting engine connected to my game engine to control game scenes, their content and the event flow between components.
Example of evaluating simple LISP code with that interpreter.
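The original engine was written in Dart; here's a compact TypeScript sketch of the same idea, just to show the moving parts: tokenize, read into nested lists, then evaluate against an environment of functions.

```typescript
type Expr = number | string | Expr[];

function tokenize(src: string): string[] {
  return src.replace(/\(/g, ' ( ').replace(/\)/g, ' ) ').trim().split(/\s+/);
}

function read(tokens: string[]): Expr {
  const token = tokens.shift();
  if (token === undefined) throw new Error('Unexpected end of input');
  if (token === '(') {
    const list: Expr[] = [];
    while (tokens[0] !== ')') list.push(read(tokens));
    tokens.shift(); // drop ')'
    return list;
  }
  const num = Number(token);
  return Number.isNaN(num) ? token : num;
}

type Env = Record<string, (...args: any[]) => any>;

const env: Env = {
  '+': (...xs: number[]) => xs.reduce((a, b) => a + b, 0),
  '*': (...xs: number[]) => xs.reduce((a, b) => a * b, 1),
  // Game-facing hooks would live here too, e.g. 'spawn', 'on-event', etc.
  'print': (...xs: unknown[]) => console.log(...xs),
};

function evaluate(expr: Expr): any {
  if (typeof expr === 'number') return expr;
  if (typeof expr === 'string') return env[expr];
  const [op, ...args] = expr.map((e) => evaluate(e));
  return (op as (...a: any[]) => any)(...args);
}

// evaluate(read(tokenize('(print (+ 1 2 (* 3 4)))')))  // prints 15
```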
After an amazing 4 years at my previous job, I was ready to try something of my own. Over the years I had been collecting ideas as they occasionally came to mind, so I went through them and picked the one which seemed the most promising.
What if you could subscribe to any value on the Internet? Could be pretty handy!
No more daily visits to a range of bookmarks, no more problems with looking for an absent RSS feed.
Excited about the possible applications, I started by building a PoC in about a week, which allowed me to track local prices of video cards.
Small demo of Wutch in action.
Right now the service is in an indefinite Closed Beta (ping me at av@av.codes if you're interested in participating), and I've put development on hold until the situation in my country becomes less tense.
I was always fascinated with software. I fiddled with all the programs I could get my hands on.
Once, I got my hands on a simple sample editor and found it extremely entertaining to distort recordings of my own voice.
Long story short, a year after that I attempted to write music with a computer. Initially, it was as bad as you think it was (maybe even worse). Luckily, those first recordings are stored safely on a hard drive whose location I won't tell a single soul.
After many years of making really bad things, quantity turned into some degree of quality. You can find the recordings I'm least ashamed of on my SoundCloud account.
You're viewing a CV of a Software Engineer, so consider yourself warned :)
I loved drawing, had passable skills with Graphic Design software and had lots of long, lonely evenings after studying.
I created and curated a Typography community on one of the social networks popular in my country.
After that, it was a very short step to considering myself a Graphic Designer and applying for projects on freelance job boards.
Along the way, I produced a lot of things which are still hanging on the walls at my place.
These are literally on the wall of the room I'm sitting in right now.
To check out more, you may visit my old (and dusty) portfolio.
At some stage, my mother and I moved to another village and set about building a new life for ourselves.
I couldn't say it was boring, but I certainly had fewer things to do than before.
So, somehow my Mom got me into a remote course on so-called "Web Design", where they taught me the difference between unordered and ordered lists and all the powers of table layouts.
Writing that HTML on notebook pages to be sent back to the tutors by post, I could never have guessed it would be related to my future career.
As I had spent countless hours drawing during lessons at school and loved computers and everything related to them, it was only a matter of time before I got my hands on some Graphic Design software.
Initially, it was a hot mess of custom brushes and gradients which I thought looked cool. A bit later, I even managed to do some basic design for my friends and family, creating business cards and coasters for social media groups.
Years later, it helped me try my hand in the world of freelance Graphic Design; I even got a couple of actually paid gigs out of it.
As my excitement about Flutter grew, I dove quite deep into the framework and its capabilities.
At some point, I decided to share my findings with the rest of this amazing community.
I've released multiple articles on different topics:
A quite magnificent article by Ivan Cherepanov on Flutter, composition, and particle effects. Both informative and a cracking read. Great job! #Flutter @FlutterComm https://t.co/7xtIMJJyAW
— Tim Sneath (@timsneath) October 28, 2019
I was extremely proud when one of the articles was recognized by a Flutter Product Manager.
Working on my own mobile game, I needed to quickly implement 2D particle effects to be used in a variety of scenarios.
After implementing the particle system and sharing it with the engine community, I got the green light to contribute it to the engine.
Examples of Particle System in use.
You can check out the contribution and docs for more details.
I participated in a challenge from the Google Flutter team to create a new and intriguing clock face for an Android-based smart device.
Just finished my submission to #FlutterClock. Called it "timeline", cause it's a line... showing time... #Flutter #FlutterDev #DartLang #FlutterInteract #UI #uxdesign pic.twitter.com/P03AmBp5Za
— My Long Locking Story (MLLS) (@mlls_hq) January 19, 2020
You can also find my contribution on GitHub.
Like any programmer, I was mesmerised by how programming languages are built.
As a native Russian speaker, I was also curious how code is perceived by native English speakers.
So, during one of the weekends, I got my hands on Esprima, an ECMAScript parser, and patched it to support arbitrary sets of keywords instead of the hardcoded English ones.
After that, I created a small lib to traverse the AST and swap the keywords to anyone's liking.
As a result, I was able to create a small transpiler for "RedScript", a JS subset with all of the keywords in Russian.
I've also written an article (RU) on using the module.
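As a hedged illustration of the keyword-swapping idea (not the actual RedScript implementation), stock Esprima's tokenizer can be used to map Russian keywords, lexed as plain identifiers, back to English ones:

```typescript
import * as esprima from 'esprima';

// Tiny subset of the dictionary, for illustration only.
const KEYWORDS: Record<string, string> = {
  'функция': 'function',
  'если': 'if',
  'иначе': 'else',
  'вернуть': 'return',
  'пусть': 'let',
};

export function transpile(source: string): string {
  // range: true gives each token its [start, end) offsets in the original source.
  const tokens = esprima.tokenize(source, { range: true }) as { value: string; range: [number, number] }[];
  let result = '';
  let cursor = 0;
  for (const token of tokens) {
    const [start, end] = token.range;
    result += source.slice(cursor, start) + (KEYWORDS[token.value] ?? token.value);
    cursor = end;
  }
  return result + source.slice(cursor);
}

// transpile('функция сумма(а, б) { вернуть а + б; }')
// => 'function сумма(а, б) { return а + б; }'
```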
During my first months at the Company, I had a day-to-day need to generate large amounts of semi-randomised data to test and verify various behaviors of the Platform.
To help myself (and my colleagues facing the same need), I wrote a small template-based data-generation tool for churning out semi-random data for any particular need.
I was relieved from { "name": "test", "description": "test" } with DTG.
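A sketch of the template-based idea; the placeholder syntax below is illustrative rather than DTG's exact template language:

```typescript
import { randomUUID } from 'crypto';

const NAMES = ['alpha', 'bravo', 'charlie', 'delta'];

function fill(placeholder: string): string {
  if (placeholder === 'name') return NAMES[Math.floor(Math.random() * NAMES.length)];
  if (placeholder === 'uuid') return randomUUID();
  const range = placeholder.match(/^int:(\d+)-(\d+)$/);
  if (range) {
    const [min, max] = [Number(range[1]), Number(range[2])];
    return String(min + Math.floor(Math.random() * (max - min + 1)));
  }
  return placeholder;
}

// Substitute every {{...}} placeholder with semi-random content.
export function generate(template: string): string {
  return template.replace(/\{\{(.+?)\}\}/g, (_, key: string) => fill(key.trim()));
}

// generate('{"name": "{{name}}", "batteryLevel": {{int:1-100}}, "id": "{{uuid}}"}')
// => '{"name": "charlie", "batteryLevel": 42, "id": "0b9d..."}'
```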
At some point I was attracted to Rust.
To get my hands dirty with it, I decided to start small and try solving Project Euler puzzles.
You can check out my attempts in this repo.
Tired of typing endless {s and "s, I tried my hand at building a specification for a data language (like JSON or YAML).
I developed a spec as a set of railroad diagrams (to build a parser upon) for a new (or so I thought) data transfer language, only to realise that someone infinitely smarter than me had already created YAML for a very similar purpose.
In the project logo, I tried to highlight that YSON is like JSON, but without the unnecessary parts.
During my work, I was issuing hundreds of various calls to the Platform APIs.
Given my curiosity about how programming languages are built, I explored parsing plain English text and transforming it into a series of HTTP calls querying the Platform APIs.
I created a grammar using the Nearley toolkit and integrated it with the Platform SDK to perform queries like:
One of the cool features was that it could perform filtering much deeper than allowed by the main Platform APIs.
As a result of one of my projects, I released a JavaScript library for orchestrating arbitrary code and recording its behavior at runtime.
I've also written an article (RU) about implementing simple tracers compatible with chrome://tracing.
I've joined a startup focusing on the Business Intelligence space. The company created a platform for collecting and processing data on businesses and their activities.
We served over a million company profiles, providing our Users access to more than 400 data points for each.
During my time at the company, we significantly expanded our product line, adding solutions for monitoring, collaboration, risk assessment, and more.
I'm proud of all the work we've done, and I'm happy to have been a part of it.
My contributions ranged even wider than in previous positions, from DB optimisations to Product/Design work, encompassing the whole flow of delivering the product to the end user.
My journey as a Software Engineer at Craft has been a mix of technical challenges and triumphs, spanning various aspects of the platform's development.
I've contributed to a wide range of projects, from infrastructure management to UI/UX enhancements, always focused on delivering high-quality, performant features to our users.
Harbor is a tool to effortlessly run LLM backends, APIs, Frontends and services with one concise CLI.
It started as a local setup, and I had a chance to significantly expand upon it between jobs.
Harbor high-level overview
Harbor gathered some positive feedback from the community and I had a chance to meet many interesting people along the way.