Auction Houses updating at the same time

Hello,

Does anyone know why Blizzard made it so that almost all Auction Houses update at the same time? In the previous API the updates were nicely spread out, but now I am calling 242 Auction Houses at basically the same time, and my server does not like it.

I am now processing 17,171,594 rows every hour, all within basically a few minutes of each other, haha.

/love Grumpymuppet

Yeah, I’m not a fan either; the old way spread out the load a lot better. It happened around the time they updated the AH in-game with the commodities stacks.

Interestingly, if you look at the auction IDs across all the connected realms, they’re clumped together a lot more than they used to be. Across all US connected realms there are 9 different groups of auction IDs, which leads me to believe that groups of 8 to 18 connected realms at a time all share the same backend database, when perhaps before the change, each connected realm had its own backend database. Just a guess.
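To illustrate the kind of analysis behind that guess, here is a minimal sketch of clustering connected realms by how close their current auction IDs sit to each other. The realm names, IDs, and the gap threshold are all made up for illustration; real numbers would come from the auctions endpoint.

```python
# Hypothetical sketch: group connected realms whose maximum auction IDs are
# close together, on the theory that realms in the same backend database
# draw IDs from the same sequence. All values below are invented.

def cluster_by_auction_id(max_ids, gap=10_000_000):
    """Group realms whose max auction IDs are within `gap` of the previous one."""
    groups = []
    for realm, auction_id in sorted(max_ids.items(), key=lambda kv: kv[1]):
        if groups and auction_id - groups[-1][-1][1] <= gap:
            groups[-1].append((realm, auction_id))
        else:
            groups.append([(realm, auction_id)])
    return groups

sample = {
    "illidan": 2_000_050_000,
    "stormrage": 2_000_090_000,   # close to illidan -> same backend group
    "area-52": 2_500_000_000,     # far away -> its own group
}
groups = cluster_by_auction_id(sample)
print(len(groups))  # 2
```

With real data across all US connected realms, this kind of grouping is presumably what produced the "9 different groups" observation above.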

I would absolutely not mind having this spread evenly throughout the hour. Or at least a bit more spread out through the first 30 minutes of the hour or something.


@ukon @erorus Maybe we can get a Blizzard reply on this, because I am pretty sure this won’t do them any favors either. Those who use the API will now all call all realms at the same time.

All US/EU realms fall within 2 minutes for me: EU at :08 and US at :06. That is 241 calls within 2 minutes, with over 17 million rows of data.

They’re aware. See the last quote reply in this post: World of Warcraft API Update - Visions of N'zoth - #15 by Araspir

@erorus Haha, “but whatev”. Thanks for making our voice heard. However, I think he understood it as “you guys might get issues doing this within the same minute”, although the issue will most likely be on our side, because our servers are not as strong. At least mine is not: I need 100%+ of my server capacity if I want to update them all at the same time, which I’m not doing, of course.

I had the same issue as the author of this thread a long time ago, but then I learned how to write my code asynchronously, so now I update ~200 EU realms in batches of 20, like:

UPDATE 200 REALMS
  ==> UPDATE FIRST 20
  ==> UPDATE SECOND 20
  ...
  ==> UPDATE Nth 20

By the way, take a look at the http://reactivex.io/ library and some of its operators. It could help you solve the problem where all auction houses update at once and the database, inserting all the new data, floods your RAM and drives CPU load through the roof.
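The batch-of-20 idea above can also be sketched in plain asyncio, without ReactiveX. This is a minimal, hypothetical version: `fetch_auctions` is a stand-in for the real API call and just returns the realm id so the example is runnable.

```python
import asyncio

# Sketch of updating 200 realms in batches of 20: only one batch of
# requests is in flight at a time, so the database never has to absorb
# 200 result sets simultaneously.

async def fetch_auctions(realm_id):
    await asyncio.sleep(0)  # placeholder for the real HTTP request
    return realm_id

async def update_in_batches(realm_ids, batch_size=20):
    results = []
    for i in range(0, len(realm_ids), batch_size):
        batch = realm_ids[i:i + batch_size]
        # gather runs the whole batch concurrently, then we move on
        results += await asyncio.gather(*(fetch_auctions(r) for r in batch))
    return results

done = asyncio.run(update_in_batches(list(range(200))))
print(len(done))  # 200
```

An Rx operator like `bufferCount(20)` would express the same batching declaratively; the asyncio loop above is just the imperative equivalent.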

So, to be honest, this isn’t really a problem the devs should look into. For me, the real problem is the recipe endpoint and the long gaps between auction timestamps (as of now it’s nearly 3-4 hours, instead of the old 45m - 2h).

Other than EU-Aman’Thul (which hasn’t updated in 10 days), auction snapshots have been hourly pretty regularly: hxxps://theunderminejournal.com/dataintervals.php


Yes, you’re right, but I guess you won’t deny that sometimes the situation with the auction house endpoint is not so «bright».

I remember how ~3 weeks ago I had only 5-6 timestamps during the last 24 hours. Anyway, I’m glad the endpoint is working, and as of now it provides data every hour: :blush:

  { _id: 1595092109 },
  { _id: 1595088510 },
  { _id: 1595084909 },
  { _id: 1595081310 }

Personally, the biggest issue with this for me is the database (RDS).
It does not like me pushing the processed data for too many AHs at once :stuck_out_tongue_closed_eyes:. I’m running serverless everywhere else, so compute is not an issue for me; it auto-scales.

It would be something like 123,212 unique items with bonuses, speciesId, etc. * 242 realms = 29,817,304 rows per hour, within a 10-minute timespan. This could probably be solved by paying for a more powerful database, but that is not tempting when I earn nothing from having my app running :rofl:

But the raw auction house data itself I have moved out to a CDN (AWS S3), per region. I just download the AH data, dump it to a file, and trigger a Lambda function to process it. Then users can get the data as soon as it’s ready in the S3 bucket…

Like AlexZeDim, I fetch AH data in batches of 20, but spread out over a minute, and repeat until it’s done.

I also prioritize updates based on when users last requested them. So if someone is currently using the app for a specific realm, they should get the new data as soon as the dump is available from Blizzard.
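That prioritization can be sketched very simply: keep a map of realm to last-request time and process realms in most-recently-requested order. The realm slugs and timestamps below are made up for illustration.

```python
import time

# Hypothetical sketch of "most recently requested realms update first".
# last_requested maps realm slug -> unix time of the last user request;
# realms nobody has asked for (time 0.0) fall to the back of the queue.

def update_order(last_requested):
    return sorted(last_requested, key=lambda realm: last_requested[realm],
                  reverse=True)

now = time.time()
last_requested = {
    "draenor": now - 30,       # someone is using the app right now
    "silvermoon": now - 3600,  # requested an hour ago
    "quel-thalas": 0.0,        # never requested
}
order = update_order(last_requested)
print(order)  # ['draenor', 'silvermoon', 'quel-thalas']
```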


@AlexZeDim

I don’t recognize the delay you describe, and I do manage to update all realms at once without flooding my CPU too much. However, it is simply not needed: there is absolutely no reason why all realms should be updated within the same timeframe. Spread them out evenly across the hour and no one will notice the difference, besides our servers and Blizzard’s.