I’m having trouble with timestamp data from Airtable in BigQuery. When I upload it, the dates turn into weird decimal numbers. Here’s what I mean:
Decimal format     | Actual date
-------------------|--------------------
44493.61666667     | 2021-10-24T14:48:00
44522.775          | 2021-11-22T18:36:00
44493.67916666667  | 2021-10-24T16:18:00
44522.739583333336 | 2021-11-22T17:45:00
Does anyone know how to fix this? I need a query to change these decimal numbers back into normal dates and times. The table name is Airtable if that helps. I’ve been stuck on this for hours and can’t figure it out. Any help would be awesome!
I’ve encountered this issue before when working with Airtable data. Those decimals are serial date numbers: the integer part counts days since an epoch, and the fraction is the time of day. That epoch is not the Unix epoch BigQuery expects, so to convert these to standard timestamps you need to adjust for the difference in epoch start dates.
Here’s a query that should work for your Airtable data:
SELECT
  -- serial date: integer part = days since 1899-12-30, fraction = time of day
  TIMESTAMP_ADD(TIMESTAMP '1899-12-30', INTERVAL CAST(your_column * 86400 AS INT64) SECOND) AS converted_timestamp
FROM
  Airtable
This query takes into account that the serial dates count from December 30, 1899 (the same convention spreadsheets like Excel use). It multiplies the decimal value by 86,400 (the number of seconds in a day) to get a total number of seconds, then adds that to the epoch start date.
Remember to replace ‘your_column’ with the actual name of the column containing the decimal timestamps. Let me know if you need any clarification on this approach.
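If you want to sanity-check the formula before pointing it at your table, here’s a quick test against the literal values from the question (the inlined list of serials is just for illustration):

SELECT
  serial,
  -- same conversion as above, applied to the sample values
  TIMESTAMP_ADD(TIMESTAMP '1899-12-30', INTERVAL CAST(serial * 86400 AS INT64) SECOND) AS converted
FROM UNNEST([44493.61666667, 44522.775, 44493.67916666667, 44522.739583333336]) AS serial

Each row should come back matching the “Actual date” column from your table (in UTC).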
I’ve dealt with this Airtable timestamp issue before, and it can be a real headache. The trick is understanding that Airtable uses a different date system than most databases. Here’s what worked for me:
SELECT
  -- 25569 is the serial number of 1970-01-01 (the Unix epoch) in this date system
  TIMESTAMP_MICROS(CAST((your_column - 25569) * 86400000000 AS INT64)) AS converted_timestamp
FROM Airtable
This query converts the serial value to Unix-epoch microseconds, then to a proper BigQuery timestamp. The magic number 25569 is the serial value of 1970-01-01 (the Unix epoch) in the 1899-12-30 date system, so subtracting it re-bases the day count onto the Unix epoch before scaling by 86,400,000,000 (microseconds per day).
One thing to watch out for: make sure your data types are consistent. If you’re getting errors, double-check that your_column is being read as FLOAT64 or NUMERIC in BigQuery. Also, you might need to adjust the precision depending on your specific use case.
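If the column actually landed in BigQuery as a STRING (common with CSV-based loads), a sketch of the same conversion with an explicit cast could look like this; your_column and Airtable are just the placeholder names from this thread:

SELECT
  -- SAFE_CAST returns NULL instead of erroring on rows that aren't valid numbers
  TIMESTAMP_MICROS(
    CAST((SAFE_CAST(your_column AS FLOAT64) - 25569) * 86400000000 AS INT64)
  ) AS converted_timestamp
FROM Airtable

SAFE_CAST is handy here because a single malformed row won’t fail the whole query; you can filter for NULL results afterwards to find the bad rows.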