| Area | Technologies & Practices |
| --- | --- |
| Web Crawler | Java, Spring Boot, AWS, Crawler4J, and Vue JS |
| Methodology, Tools | Kanban, Lean, CI/CD, Trunk-based Development, Feature Toggling, Champagne Brunch, Automated Testing, DevOps, Immutable Infrastructure, and Infrastructure as Code |
| Website Redesign | AWS, Serverless, Vue JS, Nuxt, Java, Spring Boot, Vavr, Node JS, and Express |
We joined forces with an American financial services company of nearly 7,000 employees that has been empowering investor success since 1984. A major player in the asset management industry, it manages over $200 billion (as of March 31) and generates $1 billion in revenue. Its three main regions of operation are the United States of America, China, and India. The company's core values are:
- Investors First
- Great Products
- Great People
- Uncompromising Ethics
- Entrepreneurial Spirit
- Financial Success
This company serves investors, from individuals to large corporations, with information on companies to help them evaluate investment potential. To present the latest and most complete information and forecasts to its clients, the financial services company collects financial updates through web crawlers and private sources, then analyzes the data and draws its own conclusions.
Today it is a giant corporation that grew by acquiring and merging with smaller companies. Each acquired sub-company works independently and represents a slightly different direction, called a “business line.” Our teams helped with three separate business lines: Web Crawler, Data Collection, and Website Redesign.
The financial services company needed help with data integration, and our team stepped in to take on this goal. For almost a year, a team of six experts worked on the integration project, which was successfully implemented. We then moved on to other business objectives connected to financial services software development.
The financial services company needed to get more accurate financial information for their clients. We helped them with this global mission by developing the following projects:
- Web Crawler – investment research software that collects information from websites. This additional data source would improve the company's overall awareness of investment information. Business Analysts would later process the collected data and draw conclusions from it.
- Data Collection – an application for gathering information on companies' Initial Public Offerings (IPOs). The financial services company needed a tool that could quickly gather and process specific financial data as well as provide analysis.
- Website Redesign – the company's website needed a complete redesign, plus additional features on par with its modern products.
- Web crawling is a complex process with many nuances, such as differing website designs, various types of data sources, and interactions between separate data sources on a single website. Development posed an additional challenge: front-end infrastructure and testing goals had been set, but we only had Back-end Engineers.
- Data Collection challenges included the limited size of the team, strict deadlines, and the fact that the team had not performed this precise kind of work before.
- Working with AWS was a requirement
- Our team was restricted to using only Vue JS and Angular, so we couldn’t leverage our React expertise in this case
To complete all projects, we eventually expanded our team to 15 experts. Here are some details about our management approach:
The distinctive feature of the way we deal with challenges is craftsmanship, also known as the engineering approach. We have a team of Back-end Developers who can take over back-end, front-end, infrastructure, testing, and any other task that might come up. I believe that every expert on the team needs to be an engineer focused on solving business goals. It is very limiting when an expert focuses only on their specific area and can’t handle other technical challenges. My experience in startup and product companies taught me that having a team of all-around engineers capable of understanding the vision and solving business problems is crucial to a project's overall success.
– Igor Dmitriev, Engineering Manager
For the Web Crawler, we started with Regular Expression (Regex) scanning for keywords, built on an open-source framework and leveraging Java and Spring Boot. But the Regex approach proved inefficient: keyword patterns matched out of context, producing a large amount of unusable information.
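The weakness of bare keyword scanning can be sketched as follows. This is an illustrative example, not the actual crawler code; the class, keyword, and window size are assumptions for demonstration.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Minimal sketch of regex keyword scanning. A naive pattern finds every
// occurrence of a keyword, including matches whose surrounding context
// is irrelevant to the analyst's question.
public class KeywordScanner {
    private static final Pattern IPO_PATTERN =
            Pattern.compile("\\bIPO\\b", Pattern.CASE_INSENSITIVE);

    public static List<String> findMatches(String pageText) {
        List<String> hits = new ArrayList<>();
        Matcher m = IPO_PATTERN.matcher(pageText);
        while (m.find()) {
            // Capture a small window around the match for later review.
            int start = Math.max(0, m.start() - 20);
            int end = Math.min(pageText.length(), m.end() + 20);
            hits.add(pageText.substring(start, end).trim());
        }
        return hits;
    }

    public static void main(String[] args) {
        String text = "Acme Corp announced its IPO for Q3. "
                + "Meanwhile, the IPO of concert tickets sold out.";
        // Both sentences match, but only the first concerns a public offering.
        System.out.println(KeywordScanner.findMatches(text).size()); // prints 2
    }
}
```

Every textual hit looks identical to the scanner, which is why downstream analysts ended up wading through unusable results.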
That’s why we turned to Machine Learning and its Natural Language Processing capabilities. Doc2Vec and Word2Vec models allowed us to determine the context of a piece of text, making the resulting data far more usable by eliminating unwanted matches and valueless processing.
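The context-filtering idea behind these embedding models can be sketched as below. The tiny hand-made vectors stand in for real trained Doc2Vec/Word2Vec embeddings, and the threshold value is an assumption; the actual project used trained models, not these toy numbers.

```java
// Sketch of embedding-based context filtering: documents are mapped to
// vectors, and cosine similarity against a reference vector for the
// target topic decides whether a keyword match is in context.
public class ContextFilter {
    // Cosine similarity between two vectors of equal length.
    public static double cosine(double[] a, double[] b) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    // A document counts as "in context" if its vector is close enough
    // to the topic's reference vector.
    public static boolean inContext(double[] docVector, double[] topicVector, double threshold) {
        return cosine(docVector, topicVector) >= threshold;
    }

    public static void main(String[] args) {
        double[] financeTopic = {0.9, 0.1, 0.0}; // toy embedding for "public offerings"
        double[] financeDoc   = {0.8, 0.2, 0.1}; // a finance-related snippet
        double[] concertDoc   = {0.1, 0.2, 0.9}; // the out-of-context match
        System.out.println(ContextFilter.inContext(financeDoc, financeTopic, 0.7)); // true
        System.out.println(ContextFilter.inContext(concertDoc, financeTopic, 0.7)); // false
    }
}
```

With real embeddings, the same comparison discards keyword hits whose surrounding text points to an unrelated topic.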
Our team completed all of the front-end tasks using the Vue JS framework. Since we had no QA Engineers, our back-end team learned to conduct testing, including building a testing strategy.
The first move here was to bring more experts onto the project, for a total of 10 members. To meet the strict deadline while maintaining the highest level of quality, we worked 12-hour days for three weeks without any days off. The craftsmanship approach and dedication of each team member allowed us to pull this off. Unfortunately, we cannot disclose the full business details of this project, but we managed to live up to the highest expectations on this one. The main technologies we used were Spring Boot and Docker. Working with AWS was a requirement, and we made the best of it: we built the infrastructure, set up monitoring, and implemented best practices into the processes.
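Since the stack paired Spring Boot with Docker, the packaging step likely resembled the fragment below. This is a generic sketch; the actual base image, artifact name, and port are not disclosed in the case study and are placeholders here.

```dockerfile
# Illustrative Dockerfile for a containerized Spring Boot service.
# Image name, jar path, and port are assumptions, not the team's config.
FROM eclipse-temurin:17-jre
WORKDIR /app
COPY target/data-collection.jar app.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "app.jar"]
```

An image built this way is immutable, which fits the Immutable Infrastructure practice listed in the table above: updates ship as new images rather than changes to running servers.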
This project was different from the other two because our expertise was provided via an outstaff model: five experts were integrated into the American development team. As mentioned earlier, we hired highly skilled front-end developers to complete an independent team on our side. There was a tech stack restriction, as only Vue JS and Angular were allowed, so we couldn’t use our React expertise. We completed all of the tasks assigned by the company’s management and proved the effectiveness of this engagement model for the business.
The Web Crawler’s Machine Learning model was trained on over 1,500 entries, and we connected the solution to the AWS cloud platform. After the project was completed, its support was handed over to another development team. Our team delivered a finished product that added to the value and volume of data collected by the financial services company and saved the Business Analysts’ time.
What we are most proud of is that in the Data Collection project, we ran all business-related and development processes ourselves. We had just one business expert on the client’s side and were able to manage every nuance and make all key decisions from beginning to end. Our team became part of the corporate culture, communicated within this large organization, and delivered the product to production. To meet the strict deadline, the MVP was ready in two months, with the team releasing 30 tasks a week.
Data Collection wasn’t the only project among our client’s teams working with IPOs. However, after launch it became obvious that we had created the best solution for this task, surpassing software that was already in use. The financial services company therefore decided to use the Data Collection tool as a template for all similar new solutions developed by teams inside the organization. Ultimately, we created a set of unified tools to optimize the company’s internal business functions, speed, and productivity. We implemented CI/CD with Spinnaker for deployment and made it a standard across the Data Collection teams.
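A Spinnaker-based delivery pipeline of the kind described above is typically defined as JSON stages. The fragment below is a hedged illustration of such a pipeline's shape; the application name, stage set, and environments are assumptions, not the team's actual configuration.

```json
{
  "application": "data-collection",
  "name": "Deploy on merge",
  "stages": [
    { "refId": "1", "type": "bake",           "name": "Bake image",          "requisiteStageRefIds": [] },
    { "refId": "2", "type": "deploy",         "name": "Deploy to staging",   "requisiteStageRefIds": ["1"] },
    { "refId": "3", "type": "manualJudgment", "name": "Approve production",  "requisiteStageRefIds": ["2"] },
    { "refId": "4", "type": "deploy",         "name": "Deploy to production","requisiteStageRefIds": ["3"] }
  ]
}
```

Standardizing on one pipeline definition like this is what made the setup reusable as a template across the other Data Collection teams.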
With the Website Redesign, we successfully integrated our experts into the American team and earned their respect. In the beginning, they gave us only development tasks, but later granted us more responsibility and freedom in design and technical implementation. The website was released as planned, and the migration was completed on time. We continued supporting the website after release and later tweaked some features. The website is now stable and handles a high volume of traffic, with hundreds of thousands of visitors.
SPD Group successfully executed on several software projects for us. All of their developers and management staff are highly talented and very professional. Working with SPD felt like we were working with an internal team. They are always accessible any time of the day and very flexible in providing support to our users who are globally distributed. The technical acumen of SPD developers has been top notch. SPD management have always been very responsive in hiring additional developers for us on short notice which helped us deliver quickly on several projects.
– Head of Technology
ARE YOU INTERESTED IN DEVELOPING A FINTECH SOLUTION?
Contact our experts to get a free consultation and a time & budget estimate for your project.

Contact Us