9 Lessons Learned from Building 60 Data Source Integrations
Gone are the days when developers had to code every aspect of their product from scratch. Today, a cacophony of databases and APIs exists with the explicit purpose of letting developers build on existing frameworks and stacks. But as in any menagerie, some birds squawk and squeal while others sing in perfect tune.
Over the past months, our team at Panoply.io has implemented over 60 data source integrations into our platform. To achieve this, we developed a data extraction framework that abstracts over the differing implementations of data sources, so that any future source can be integrated with just a few hours of coding. The foremost challenge we faced was making this layer robust enough to survive changes over time, as well as feature and version fragmentation, without kicking off an endless maintenance spiral.
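As a rough illustration of what such an abstraction layer can look like (the class and method names below are hypothetical sketches, not Panoply's actual API), the framework typically defines one small interface that every integration implements, so the core pipeline never needs to know which source it is talking to:

```python
from abc import ABC, abstractmethod
from typing import Any, Dict, Iterator, List


class DataSource(ABC):
    """Common interface each integration implements. The framework core
    only ever talks to this abstraction, so adding a new source means
    writing one small subclass rather than touching the pipeline."""

    @abstractmethod
    def connect(self) -> None:
        """Authenticate and open a connection to the external service."""

    @abstractmethod
    def extract(self) -> Iterator[Dict[str, Any]]:
        """Yield records as plain dicts, one row at a time."""


class InMemorySource(DataSource):
    """Toy source used purely for illustration; a real integration
    would wrap an HTTP API client or a database driver."""

    def __init__(self, rows: List[Dict[str, Any]]) -> None:
        self.rows = rows
        self.connected = False

    def connect(self) -> None:
        self.connected = True

    def extract(self) -> Iterator[Dict[str, Any]]:
        yield from self.rows


def run_pipeline(source: DataSource) -> List[Dict[str, Any]]:
    """Framework core: identical for every source integration."""
    source.connect()
    return list(source.extract())


records = run_pipeline(InMemorySource([{"id": 1}, {"id": 2}]))
print(records)  # → [{'id': 1}, {'id': 2}]
```

The point of the design is that version and feature fragmentation is contained inside each subclass, keeping the shared pipeline stable as individual sources change.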