How to ingest huge amounts of data into a GraphQL API

When working with GraphQL APIs, the ultimate goal is to connect them to your services so that data is exchanged automatically. However, when setting up a GraphQL API and its backend services, you first want to test everything before connecting it to other services.

Usually that is done with Insomnia or Postman: you manually send a few requests and check that everything works as expected. But if you want to see how the system performs at scale, or if you want to do an initial load before connecting it to the real-time stream of data ingested via the API, you either need to write code to batch-ingest the data or find another solution, as this is not possible with Insomnia, Postman or other similar tools.

We faced the same problem at Tilores, as we often run PoCs for customers and have to ingest millions of records at a time. First we created an upload screen in our SaaS UI, but it was only intended for small files and slow upload speeds; with larger files the browser would crash or freeze, so we limited uploads to a file size of 10 MB. For small clients that was sufficient, but for clients with more than a million records it did not work out.

After discussing this issue internally, we decided to build a tool that reads an input file and uploads it to a GraphQL API. As all our APIs require authentication, we had to build in authentication, along with other features that become necessary when uploads take a long time. When uploading huge amounts of data you also want to take advantage of parallelisation to speed up the process, and by sending concurrent requests you can load-test your API at the same time.
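To illustrate the general idea, here is a minimal sketch in Go of that pattern: read records from a file and send them to a GraphQL endpoint from a pool of concurrent workers, with a bearer token for authentication. The endpoint URL, the `submit` mutation, the token and the `records.jsonl` file name are all placeholder assumptions for this example, not the actual internals of our tool.

```go
// Sketch: concurrent ingestion of JSON-lines records into a GraphQL API.
// Endpoint, mutation, token and file name are assumptions for illustration.
package main

import (
	"bufio"
	"bytes"
	"encoding/json"
	"fmt"
	"log"
	"net/http"
	"os"
	"sync"
)

const (
	endpoint = "https://api.example.com/graphql" // assumed endpoint
	token    = "YOUR_ACCESS_TOKEN"               // assumed auth token
	workers  = 8                                 // number of concurrent uploads
)

// Placeholder mutation; replace it with one from your own schema.
const mutation = `mutation Submit($record: RecordInput!) { submit(record: $record) { id } }`

// send posts a single record as a GraphQL mutation with an Authorization header.
func send(client *http.Client, record json.RawMessage) error {
	body, err := json.Marshal(map[string]any{
		"query":     mutation,
		"variables": map[string]any{"record": record},
	})
	if err != nil {
		return err
	}
	req, err := http.NewRequest(http.MethodPost, endpoint, bytes.NewReader(body))
	if err != nil {
		return err
	}
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer "+token)
	resp, err := client.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return fmt.Errorf("unexpected status %s", resp.Status)
	}
	return nil
}

func main() {
	file, err := os.Open("records.jsonl") // one JSON record per line
	if err != nil {
		log.Fatal(err)
	}
	defer file.Close()

	jobs := make(chan json.RawMessage)
	var wg sync.WaitGroup
	client := &http.Client{}

	// Fixed pool of workers sending requests in parallel.
	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for record := range jobs {
				if err := send(client, record); err != nil {
					log.Printf("upload failed: %v", err)
				}
			}
		}()
	}

	// Feed records to the workers line by line.
	scanner := bufio.NewScanner(file)
	for scanner.Scan() {
		line := make(json.RawMessage, len(scanner.Bytes()))
		copy(line, scanner.Bytes())
		jobs <- line
	}
	close(jobs)
	wg.Wait()
	if err := scanner.Err(); err != nil {
		log.Fatal(err)
	}
}
```

Raising or lowering the number of workers lets you tune throughput, or deliberately push the API hard enough to use the same run as a load test.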

The result of all of this is our open-source batch-graphql tool, which you can use yourself to upload data to and query GraphQL APIs.
