
Known Limitations of Adding Data into Google BigQuery


Yufei Chen

Software Engineer, Principal • MicroStrategy


This article details the known issues that occur when the Intelligence Server is trying to insert data into Google BigQuery.

When inserting data into Google BigQuery, there are two known limitations that can cause errors when creating and inserting data into intermediate tables.

  1. Maximum rate of table metadata update operations
    The "table metadata update operations" limit is a rate limit on the number of changes to a table's metadata. This includes 'tables.insert', 'tables.update', and 'tables.delete' operations, as well as anything else that modifies table storage, such as load jobs, query jobs with destination tables, and DML statements. 

    The maximum rate of table metadata update operations is five operations every 10 seconds per table. So whenever the insert frequency on a single table exceeds five operations per 10 seconds, an error will appear. 

    For more information, see Quotas and limits. 
     
  2. Daily destination table update limit
    Destination tables in a query job are subject to the limit of 1,000 updates per table per day. So whenever more than 1,000 inserts are performed on a table in less than 24 hours, an error will appear.

    For more information, see Quotas and limits. 
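
To stay under the first limit, client code that drives repeated inserts can space out its operations. The following is an illustrative client-side throttle (not part of Strategy or the BigQuery API) that blocks so that no more than five table-update operations start within any 10-second window; the class name and defaults are this sketch's own:

```python
import collections
import time


class TableUpdateThrottle:
    """Client-side throttle for table-update operations.

    BigQuery allows roughly 5 table metadata update operations per
    10 seconds per table, so acquire() blocks before exceeding that
    window. Clock and sleep are injectable for testing.
    """

    def __init__(self, max_ops=5, window_seconds=10.0,
                 clock=time.monotonic, sleep=time.sleep):
        self.max_ops = max_ops
        self.window = window_seconds
        self.clock = clock
        self.sleep = sleep
        self.timestamps = collections.deque()  # start times of recent ops

    def _evict_expired(self, now):
        # Drop timestamps that have fallen outside the sliding window.
        while self.timestamps and now - self.timestamps[0] >= self.window:
            self.timestamps.popleft()

    def acquire(self):
        """Block until another operation may run, then record it."""
        now = self.clock()
        self._evict_expired(now)
        if len(self.timestamps) >= self.max_ops:
            # Wait until the oldest operation leaves the window.
            self.sleep(self.window - (now - self.timestamps[0]))
            now = self.clock()
            self._evict_expired(now)
        self.timestamps.append(now)
```

Call `acquire()` immediately before each statement that modifies the target table; operations naturally spaced further apart than the limit pass through without any waiting.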

In Strategy, some functionality relies on inserting data into the database to calculate final results. For example, if a report uses a function that the database does not support, the Intelligence Server must perform the calculation itself and then write the results back to the database for further processing. In that situation, you could instead create a User-Defined Function (UDF) in Google BigQuery so the calculation runs in the database. Another common scenario is multi-source access (also called Heterogeneous Data Access, or HDA), in which Google BigQuery is the primary database and an intermediate table must be created and populated in Google BigQuery.
If you encounter a situation like the one above, you can avoid these errors by making the other database the primary database and creating the intermediate table there. Alternatively, you can preload the data into Google BigQuery to avoid HDA.
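
Where inserts into BigQuery cannot be avoided entirely, consolidating many small row inserts into fewer multi-row INSERT statements reduces pressure on both limits, since each DML statement counts as one table update regardless of how many rows it carries. Below is a minimal sketch of this idea; the table and column names are hypothetical, and real code should use query parameters rather than the naive value rendering shown here:

```python
def batched_insert_sql(table, columns, rows, batch_size=500):
    """Yield multi-row INSERT statements so that each batch of rows
    consumes a single table-update operation instead of one per row.

    Values are rendered naively for illustration only; production code
    should pass values as query parameters.
    """
    def render(value):
        if value is None:
            return "NULL"
        if isinstance(value, str):
            return "'" + value.replace("'", "\\'") + "'"
        return str(value)

    col_list = ", ".join(columns)
    for start in range(0, len(rows), batch_size):
        chunk = rows[start:start + batch_size]
        values = ",\n".join(
            "(" + ", ".join(render(v) for v in row) + ")" for row in chunk
        )
        yield f"INSERT INTO `{table}` ({col_list}) VALUES\n{values}"
```

With `batch_size=500`, inserting 10,000 rows costs 20 table updates rather than 10,000, which stays comfortably under the 1,000-updates-per-day limit.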



Details

Knowledge Article

Published: September 10, 2019
Last Updated: September 11, 2019