When creating an order for Prebid with this script, we have a problem with DENSE granularity.
We wanted to run the script 3x to recreate the granularity:
- 300 line items with $0.01 granularity ($0.01–$3.00)
- 100 line items with $0.05 granularity ($3.05–$8.00)
- 24 line items with $0.50 granularity ($8.50–$20.00)
However, when trying to create the first 300 line items, the script crashes with a connection error. The workaround was to split the first group of 300 line items into smaller batches of 100.
The problem is that the script then runs 5x and creates the creatives every time. So instead of the 14 creatives I wanted, I end up with 14x5 (one set per script run). Is there any way around this?
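For reference, the three DENSE-style buckets described above can be enumerated directly. This is only an illustration of the price points involved (the bucket boundaries come from the post; the helper itself is hypothetical and not part of the script):

```python
from decimal import Decimal

def price_points(start, end, step):
    """Yield CPM values from start to end (inclusive) in fixed increments."""
    start, end, step = Decimal(start), Decimal(end), Decimal(step)
    value = start
    while value <= end:
        yield value
        value += step

# DENSE-style buckets from the issue:
#   $0.01 granularity from $0.01 to $3.00  -> 300 line items
#   $0.05 granularity from $3.05 to $8.00  -> 100 line items
#   $0.50 granularity from $8.50 to $20.00 ->  24 line items
buckets = [("0.01", "3.00", "0.01"),
           ("3.05", "8.00", "0.05"),
           ("8.50", "20.00", "0.50")]

all_prices = [p for b in buckets for p in price_points(*b)]
print(len(all_prices))  # 424 line items in total
```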
There's not a great solution for this right now, as noted in the limitations. #27 is the open issue for more customized bucketing.
When the connection drops, it's likely best to archive the order and run the full script again. Feel free to open a new issue if you keep seeing connection errors.
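If re-running after a dropped connection keeps duplicating creatives, one possible workaround outside this script is to check Ad Manager for existing Prebid creatives by name before creating a new set. A minimal sketch using the googleads Python client; the name pattern, API version, and overall flow are assumptions, not the script's actual behaviour:

```python
from googleads import ad_manager

API_VERSION = 'v202311'  # assumed; use whichever version your googleads install supports

client = ad_manager.AdManagerClient.LoadFromStorage()
creative_service = client.GetService('CreativeService', version=API_VERSION)

# Look up creatives whose names match a (hypothetical) Prebid naming pattern
# so a re-run can reuse them instead of creating a fresh set.
statement = (ad_manager.StatementBuilder(version=API_VERSION)
             .Where('name LIKE :name')
             .WithBindVariable('name', 'Prebid%'))

response = creative_service.getCreativesByStatement(statement.ToStatement())
existing = response['results'] if 'results' in response else []

if existing:
    print('Found %d existing creatives; reuse these instead of creating new ones' % len(existing))
else:
    print('No existing creatives found; safe to create a new set')
```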