We've been getting hit hard by Google Search Console requests that are too large for the API to handle in one go.
However, lots of API services avoid this by taking a staggered approach to querying large amounts of data.
Their approach is simple: break one big query into multiple automatic sub-queries. Example: when you pull the "Last 13 Months" of data, GSC via Ninja Cat quickly times out if your client has a huge site, because it just cannot complete a request for that much data.
To bypass this, some services split the query into manageable, bite-sized chunks. Example: if we're consolidating a chart by month, instead of requesting every single day of all 13 months at once, why not improve the API call so that when the request is consolidated into months, we query the API one month at a time, automatically, in the background, and then stitch the results together? The same could be done by week. This way we're not forced to give up on the integration and relocate the data elsewhere.
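Here's a minimal sketch of the chunking idea. The `fetch_month` callable is a placeholder for whatever single-month request the integration actually exposes (that part is an assumption, since I don't know Ninja Cat's internals); the date-splitting logic is the real point.

```python
from datetime import date, timedelta

def month_chunks(start: date, end: date):
    """Split [start, end] into consecutive (first_day, last_day) month ranges."""
    chunks = []
    cur = date(start.year, start.month, 1)
    while cur <= end:
        # First day of the next month
        if cur.month == 12:
            nxt = date(cur.year + 1, 1, 1)
        else:
            nxt = date(cur.year, cur.month + 1, 1)
        # Clamp the chunk to the requested range
        chunk_start = max(cur, start)
        chunk_end = min(nxt - timedelta(days=1), end)
        chunks.append((chunk_start, chunk_end))
        cur = nxt
    return chunks

def fetch_all(fetch_month, start: date, end: date):
    """Run one small API call per month and merge the results.

    fetch_month is a hypothetical callable: fetch_month(start, end) -> list of rows.
    """
    rows = []
    for s, e in month_chunks(start, end):
        rows.extend(fetch_month(s, e))  # one small request instead of one huge one
    return rows
```

A "Last 13 Months" pull then becomes 13 small sequential requests, each well under the timeout, instead of one giant one.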
Hopefully that makes sense. I'd be happy to demo what I mean with another API tool.