I’m experiencing issues with date conversions when transferring data from Airtable to BigQuery. The datetime values that are accurate in Airtable turn into decimal numbers upon being uploaded into BigQuery.
Here’s a table showcasing the conversion results:
| Numeric Value      | Expected DateTime   |
|--------------------|---------------------|
| 44493.61666667     | 2021-10-24T14:48:00 |
| 44522.775          | 2021-11-22T18:36:00 |
| 44493.67916666667  | 2021-10-24T16:18:00 |
| 44522.739583333336 | 2021-11-22T17:45:00 |
I need assistance in crafting a BigQuery SQL query that can convert these decimal values back into the correct timestamp format. My table is named data_import, and the decimal column is referenced as date_field.
What would be the most effective way to perform this conversion? I’ve been stuck on finding a solution and would appreciate any guidance on the appropriate SQL query syntax.
Had this exact problem when working with legacy data from an old inventory system. The decimal format you’re dealing with represents days since December 30, 1899, where the integer portion is the date and the decimal portion represents the time fraction.
The key difference from other approaches is using TIMESTAMP_ADD with microseconds for better precision. I multiply by 24 * 60 * 60 * 1000000 to convert the decimal days directly to microseconds. This method preserved all the time precision in my case and handled the fractional seconds correctly. Make sure to check your timezone settings in BigQuery since this creates UTC timestamps by default.
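In query form, a sketch of that microsecond approach (reusing the `data_import` table and `date_field` column from the question) would look roughly like:

```sql
SELECT
  date_field,
  TIMESTAMP_ADD(
    TIMESTAMP '1899-12-30 00:00:00 UTC',
    -- 24 * 60 * 60 * 1000000 microseconds per day
    INTERVAL CAST(date_field * 24 * 60 * 60 * 1000000 AS INT64) MICROSECOND
  ) AS converted_timestamp
FROM data_import
```

The products stay well within INT64 range for serial dates in the 44000s, so the cast is safe here.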
Been there with a similar data migration nightmare from a CRM system a few years back. Your decimals are Excel serial numbers, and I found the cleanest approach is using DATETIME directly without all the microsecond complexity.
Here’s the query I use:
SELECT
date_field,
DATETIME(
DATE_ADD(DATE '1899-12-30', INTERVAL CAST(FLOOR(date_field) AS INT64) DAY),
TIME_ADD(TIME '00:00:00',
INTERVAL CAST((date_field - FLOOR(date_field)) * 86400 AS INT64) SECOND)
) AS converted_datetime
FROM data_import
This separates the date and time parts cleanly. FLOOR(date_field) gets the whole days (a plain CAST would round fractions of .5 and up to the next day), while the subtraction gives you the time fraction that gets multiplied by 86400 for seconds. The seconds go through TIME_ADD because the TIME(h, m, s) constructor won't accept a seconds value over 59.
I prefer this method because it’s more readable when you need to debug or modify later. Plus it handles edge cases better than cramming everything into one conversion function.
Test it on your sample data first, but this approach has saved me from conversion headaches multiple times.
honestly just use DATETIME_SUB(DATETIME_ADD(DATETIME '1900-01-01', INTERVAL CAST(FLOOR(date_field) AS INT64) DAY), INTERVAL 2 DAY) and add the time part separately. works every time for excel dates. the -2 days handles the leap year bug thing automatically without thinking too hard about it
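Spelled out as a full query (keeping the question's `data_import` / `date_field` names, with the time part added as seconds in the same expression), that looks roughly like:

```sql
SELECT
  date_field,
  DATETIME_ADD(
    -- 1900-01-01 plus whole days, minus 2 days for Excel's epoch quirk
    DATETIME_SUB(
      DATETIME_ADD(DATETIME '1900-01-01',
                   INTERVAL CAST(FLOOR(date_field) AS INT64) DAY),
      INTERVAL 2 DAY),
    -- fractional part of the serial date, converted to seconds
    INTERVAL CAST((date_field - FLOOR(date_field)) * 86400 AS INT64) SECOND
  ) AS converted_datetime
FROM data_import
```

Subtracting 2 days from 1900-01-01 lands on the same 1899-12-30 epoch the other answers use directly.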
Actually went through this same headache when importing financial data from Google Sheets to BigQuery last month. The issue is definitely those Excel serial date numbers, but I found a different approach that handles timezone considerations better.
SELECT
date_field,
PARSE_DATETIME('%Y-%m-%d %H:%M:%S',
FORMAT_DATETIME('%Y-%m-%d %H:%M:%S',
DATETIME_ADD(
DATETIME '1899-12-30 00:00:00',
INTERVAL CAST(ROUND(date_field * 86400) AS INT64) SECOND
)
)
) AS final_datetime
FROM data_import
The key difference here is using ROUND instead of just CAST, which prevents rounding errors that can throw off your seconds by a few digits. I also wrap it in PARSE_DATETIME to ensure consistent formatting. This method gave me exact matches when I verified against the original timestamps in my spreadsheet. Worth running a quick validation query first to make sure your decimal precision isn’t getting lost somewhere in the conversion pipeline.
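A quick validation along those lines can be run against the question's own sample values before touching the full table, using an inline UNNEST instead of data_import (the conversion expression here is the same epoch-plus-seconds approach, shown without the format round-trip for brevity):

```sql
SELECT
  date_field,
  expected,
  -- same 1899-12-30 epoch, fraction rounded to whole seconds
  DATETIME_ADD(DATETIME '1899-12-30',
               INTERVAL CAST(ROUND(date_field * 86400) AS INT64) SECOND) AS converted
FROM UNNEST([
  STRUCT(44493.61666667 AS date_field, DATETIME '2021-10-24 14:48:00' AS expected),
  (44522.775,          DATETIME '2021-11-22 18:36:00'),
  (44493.67916666667,  DATETIME '2021-10-24 16:18:00'),
  (44522.739583333336, DATETIME '2021-11-22 17:45:00')
])
```

If `converted` matches `expected` for all four rows, the decimal precision is surviving the pipeline.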
I ran into something similar last year when migrating from Excel to BigQuery. Those decimal values you’re seeing are actually Excel serial dates, which count days since 1900-01-01. Since Airtable uses the same underlying format, you’ll need to convert them by adding the appropriate number of days to the epoch.
The solution is to use DATE_ADD with your decimal values. Try this query:
SELECT
date_field,
DATETIME_ADD(
DATETIME '1899-12-30',
INTERVAL CAST(date_field * 86400 AS INT64) SECOND
) AS converted_datetime
FROM data_import
Note that I used 1899-12-30 instead of 1900-01-01 because of how Excel counts leap years. The multiplication by 86400 converts the fractional part into seconds. This approach worked perfectly for my dataset and preserved both the date and time components accurately. You might want to test it on a few rows first to verify the timezone alignment matches your expectations.