We have an Azure Data Factory pipeline that moves an on-premises SQL table to Azure Data Lake Storage Gen2 in Parquet format. I assume the majority of the cost comes from the Azure storage, right?
Now we want to move that data to BigQuery. Due to our security policy, we still need the Data Factory pipeline to read from the SQL table, so we created a Databricks notebook that reads the Parquet files and writes them to BigQuery using the Spark BigQuery connector. Now I need to estimate the total cost. On top of the Azure storage, do we have to pay some kind of egress cost to move data out of Azure storage? And would Google charge us some kind of ingress cost to load data into BigQuery?
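For reference, here is a minimal sketch of what our notebook does, assuming the Parquet files sit in an ADLS Gen2 container and a GCS bucket is available as the connector's staging area. The storage account, container, bucket, and table names are all placeholders, and the cluster is assumed to have the Spark BigQuery connector installed and GCP credentials configured:

```python
# Minimal PySpark sketch: read Parquet from ADLS Gen2, write to BigQuery.
# All names below (storage account, container, bucket, dataset/table) are
# placeholders for illustration only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Authenticate to the ADLS Gen2 account with an account key (one of several
# supported options; a service principal would also work).
spark.conf.set(
    "fs.azure.account.key.mystorageacct.dfs.core.windows.net",
    "<storage-account-key>",  # in practice, pull this from a secret scope
)

# Read the Parquet files produced by the Data Factory pipeline.
df = spark.read.parquet(
    "abfss://mycontainer@mystorageacct.dfs.core.windows.net/sql-export/"
)

# Write to BigQuery. The connector stages the data in a GCS bucket and then
# runs a BigQuery load job, so the bucket briefly holds a copy of the data.
(
    df.write.format("bigquery")
    .option("table", "my_project.my_dataset.my_table")
    .option("temporaryGcsBucket", "tmp-bq-staging")
    .mode("overwrite")
    .save()
)
```

So the data path is ADLS Gen2 → Databricks → GCS staging bucket → BigQuery, and I assume any egress charge would apply at the hop where the connector pushes data from Azure out to GCS.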
question from:
https://stackoverflow.com/questions/65832481/how-to-calcuate-the-cost-when-moving-data-from-azure-datalake-to-google-bigquery