Why Fortune 500 companies are re-building their API integrations

Russell Crowe as John Nash in the movie 'A Beautiful Mind'

These days, companies in every industry are overflowing with data while tolerating ancient API integrations.

Buried in that data are trends that point to customer insights, inefficiencies and synergy opportunities that could give a company a competitive edge. Or simply help it catch back up to its competitive peers.

But how best to harness this data? Not everyone is like the genius John Nash in the movie 'A Beautiful Mind', who could look at raw data and see the hidden, correlated magic. After all, John Nash was an eight-sigma event.

Normal humans need help synthesising data

Normal humans can't make the leap from three different CSV files and two database tables to (statistically significant) insights.

Normal humans need help. The best technology leaders understand this - giving rise to the data-driven application arms race. These applications enable a data-driven culture to operate: a culture that can seize on readily available data insights and refine or disrupt traditional approaches. An organisational behaviour professor would call this an 'informational edge enabling first-mover competitive advantage'. Military doctrine calls it simply 'tempo'.

We at DreamFactory have spent the last 7 years helping clients bring sanity to their data by building these data-driven applications, and have iterated a platform to help. How? By offering a unified way to deploy API integrations fast, which in turn allows data flow to be regulated at the API level to enable data-driven applications.

So why are Fortune 500 companies rebuilding APIs, exactly?

An API should empower the application developer to be successful, and facilitate end-customer needs according to associated privileges.

Does that sound like your API integrations? Our experience suggests it doesn't. In fact, 'key-man' risk may exist where only a few people in your company know how to use or edit an API. This typically puts entire integrations and database access at risk. Additionally, APIs tend to be built once, then wrapped in bubble wrap with 'do not touch' post-it notes stuck all over them.

The problem is that when change is needed, the API can't change, because of the time required and the risk involved. So the recurring constraint voiced by the backend team is 'we can't do that.' Essentially...

The business bends to the needs of the API.

This is what is driving the push for API integration rebuilds among our leading clients. Previous APIs exist, but they were built years ago in over-complicated ways that ultimately set future Application Developers up for failure.

They were not fit for enabling success in the digital age.

What is needed? Backend connectivity with continued flexibility. An Application Developer needs to be able to add, edit and remove data sources from an application as business strategy changes.

Application Developers need their API integrations to bend to the ever changing needs of the business.

Hence, API rebuilding has become the first step on the path to a data-driven culture.

How long does rebuilding an API manually take?

Let me start by saying there is an easier solution further below.

But our research and consultation with our community pointed to a 'commercially viable API' taking up to 46 business days. This includes:

  • design research;
  • the API build itself;
  • deployment;
  • documentation;
  • security; and
  • successful testing.

A 'complete API', we found, takes up to 70 business days. This complete API additionally includes:

  • relevant roles mapped out;
  • enforced authentication and authorization;
  • limits imposed as required;
  • logging implemented for audit and compliance purposes (e.g. GDPR); and
  • scripting for any automated processes needed.

With this feedback, we built an API calculator as a forecasting tool for manual API builds.

It quickly becomes obvious that a data-driven application drawing on six data sources through six APIs is a significant project to tackle.
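As a rough back-of-the-envelope sketch (assuming the per-API estimates above apply to each data source and none of the work is parallelised), the arithmetic looks like this:

```python
# Rough forecast mirroring the estimates above; assumes one hand-built API
# per data source and that the work happens sequentially.
COMMERCIALLY_VIABLE_DAYS = 46   # design, build, deploy, document, secure, test
COMPLETE_DAYS = 70              # plus roles, auth, limits, logging, scripting

def forecast(data_sources: int, days_per_api: int) -> int:
    """Total business days to hand-build one API per data source."""
    return data_sources * days_per_api

print(forecast(6, COMMERCIALLY_VIABLE_DAYS))  # 276 business days
print(forecast(6, COMPLETE_DAYS))             # 420 business days, roughly 1.6 working years
```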

A short example of API rebuilding gone mad

To highlight how bad this can get, one recent client DreamFactory served embarked on a mission to completely rewrite 2,500 APIs. Why? Younger competition was stealing market share because clients found it easier to integrate with their simpler APIs. And this was banking - not exactly a fast-moving, low-barriers-to-entry industry. Before DreamFactory turned up, this API rewrite had a project timeline of five years. The company was paying through the nose for an army of premium external consultants to create ten APIs a week! And the APIs being delivered were not even standardized.

DreamFactory scoped the project to be completed in 5 months by the client's internal team (no consultants needed). This represented a 93% project velocity boost and became a central tenet of their rejuvenated digital go-to-market strategy. The team then had the internal tools to add and edit APIs as needed. Application developers were happy because they finally had something they could work with to address the strategic demands of management.

The smarter way to build APIs - using a framework

The image below captures the approach we advocate, and how our clients rapidly iterate modern applications: using an API framework.

The following three 30-second GIFs show exactly how some of the world's leading companies are completing this first step. By avoiding the bottlenecks associated with API creation, significant time and risk are bypassed in establishing the APIs that Application Developers need to be successful.

Step 1: (re)build the API

Just enter your database credentials and the platform will create a commercially viable REST API. 46 business days saved. This example demonstrates building an API integration for MySQL.

This may seem fast, but that's what comes from 5 years dedicated to making a 'Ruby on Rails' for APIs. Millions of lines of code have already been written, so you don't have to write them yourself:

[embed]https://cl.ly/c58bcd36f5da[/embed]

The 'service created' green flag at the top left confirms that your commercially viable API has been created.
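If you would rather script this step than click through the UI, the same service can be registered via DreamFactory's admin API. Treat the sketch below as illustrative only: the endpoint, header and config field names are assumptions based on a typical install, so check the system API docs on your own instance for the exact payload.

```python
import requests

# Hypothetical values - substitute your own instance, credentials and admin token.
DF_HOST = "https://www.myserver.com"
ADMIN_TOKEN = "your-admin-session-token"

# Assumed payload shape for registering a MySQL service; field names may vary
# between DreamFactory versions, so verify against your instance's system docs.
service = {
    "resource": [{
        "name": "mysql",          # becomes the /api/v2/mysql/... path segment
        "label": "MySQL",
        "type": "mysql",
        "config": {
            "host": "db.internal.example.com",
            "port": 3306,
            "database": "employees",
            "username": "api_user",
            "password": "********",
        },
    }]
}

resp = requests.post(
    f"{DF_HOST}/api/v2/system/service",
    json=service,
    headers={"X-DreamFactory-Session-Token": ADMIN_TOKEN},
)
resp.raise_for_status()  # success here is the scripted equivalent of the green flag
```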

Step 2: generate Swagger 2.0 documentation

Actually, this is auto-generated within your new API using a best-practice standard, as shown below. You can skip straight to step 3.

[embed]https://cl.ly/f0dd9b5e3536[/embed]

The auto-generated Swagger documentation provides another benefit aside from the obvious ease of use - homogenized URLs. This means:

Every API has the same URL structure

Here is the common URL structure:

https://www.servername.com/api/v#/database/table?filter_parameters=abc&other_parameters=xyz

So, what does a query look like on an Oracle database as opposed to a MySQL database? Only the service name changes, provided of course they are on the same server:

Oracle: https://www.myserver.com/api/v2/oracleDB/contacts?filter=lastName%20like%20'jon%'&order=firstName

MySQL: https://www.myserver.com/api/v2/mysql/contacts?filter=lastName%20like%20'jon%'&order=firstName

What does this mean? It means that the way your developers interact with MySQL is the same as SQL Server, is the same as MongoDB, is the same as Oracle... a significant simplification for the Application Developer and end users across an ever-growing array of data sources.
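In client code, that homogenisation means the service name is effectively just a parameter. Here is a minimal sketch (the server name, API key header and response shape are assumptions for illustration; the URL pattern is the one shown above):

```python
import requests

BASE = "https://www.myserver.com/api/v2"                    # hypothetical server
HEADERS = {"X-DreamFactory-API-Key": "your-app-api-key"}    # assumed key-based auth

def find_contacts(service: str, last_name_prefix: str) -> list[dict]:
    """Same query against any backend - only the service segment changes."""
    resp = requests.get(
        f"{BASE}/{service}/contacts",
        params={"filter": f"lastName like '{last_name_prefix}%'", "order": "firstName"},
        headers=HEADERS,
    )
    resp.raise_for_status()
    return resp.json().get("resource", [])   # rows assumed to arrive in a 'resource' array

oracle_contacts = find_contacts("oracleDB", "jon")   # Oracle behind the scenes
mysql_contacts = find_contacts("mysql", "jon")       # MySQL behind the scenes
```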

Step 3: test queries to ensure the API is working

Let's see how easy it is to live-test queries prior to building applications, setting the Application Developer up for success. Note that an employees table within the same MySQL API is queried here, returning a 200 status code and the relevant data. No steps have been missed.

[embed]https://cl.ly/a7ac2d6f7633[/embed]
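The same check is easy to automate once the application team takes over. Here is a minimal pytest-style smoke test, assuming the MySQL service and employees table from the GIF (the server name and API key are placeholders):

```python
import requests

BASE = "https://www.myserver.com/api/v2"                    # hypothetical server
HEADERS = {"X-DreamFactory-API-Key": "your-app-api-key"}    # placeholder credentials

def test_employees_endpoint_returns_data():
    resp = requests.get(f"{BASE}/mysql/employees", headers=HEADERS)
    assert resp.status_code == 200                          # the 200 seen in the GIF
    assert len(resp.json().get("resource", [])) > 0         # relevant data came back
```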

Step 4: create the roles needed for your team, or proxy into the API manager of your choice, such as Apigee or MuleSoft

DreamFactory has an impressive and growing suite of API management tools; however, you can use your company's preferred manager if you like. It's a simple matter of proxying into the DreamFactory API via your preferred platform.
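From the application's point of view, proxying changes very little: the manager's gateway URL replaces the DreamFactory host, and the manager layers its own keys, quotas and analytics on top before forwarding the call. A sketch with hypothetical URLs and header names:

```python
import requests

# Direct: the application talks to DreamFactory itself.
DIRECT_BASE = "https://www.myserver.com/api/v2"

# Proxied: the same calls go through your API manager (Apigee, MuleSoft, ...),
# which forwards them to DreamFactory. The gateway URL here is hypothetical.
GATEWAY_BASE = "https://api-gateway.example.com/dreamfactory/api/v2"

def get_contacts(base_url: str, headers: dict) -> list[dict]:
    resp = requests.get(f"{base_url}/mysql/contacts", headers=headers)
    resp.raise_for_status()
    return resp.json().get("resource", [])

# Only the base URL and credentials change; the query itself is identical.
direct = get_contacts(DIRECT_BASE, {"X-DreamFactory-API-Key": "df-app-key"})
proxied = get_contacts(GATEWAY_BASE, {"apikey": "gateway-key"})  # header name depends on your manager
```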

You can pick up the rest of the steps at guide.dreamfactory.com, or follow along with a longer walkthrough just below.

A live SQL Server REST API build, test & role creation

Here is DreamFactory CTO Jason Gilmore (author of Easy Laravel 5) guiding you through the API creation process with SQL Server. The video covers API generation, documentation, API testing, role creation, and using a third-party tool to access the generated API integration and prove its functionality.

If you prefer a blog walkthrough with all the technical information, it's available here.

Conclusion

APIs are building blocks critical to any company planning on remaining competitive in the new digital age. Our experience tells us that teams are already making compromises by not tackling foundational issues with their API integrations. This tendency, we believe, stems from fears that API frameworks have since rendered obsolete.

The ability to rapidly deploy best practice API integrations that are both flexible for future change and unified for developer & customer ease is finally here.

At DreamFactory, we hope our experience has helped frame a problem we see many companies facing, yet struggling to diagnose and treat.

We would love to help you and your team overcome similar data challenges!