In some recent work, I needed to run SQL snippets written for Redshift on Google’s BigQuery platform. Since every data warehouse speaks its own SQL dialect, some translation work couldn’t be avoided.

Here are some tricks (the Redshift form first, then the BigQuery equivalent):

field::VARCHAR → CAST(field AS STRING)
isnull(), nvl() → ifnull()
union → union distinct (BigQuery requires an explicit ALL or DISTINCT after UNION)
field ILIKE pattern → UPPER(field) LIKE UPPER(pattern)
split_part(string, delimiter, part) → split(string, delimiter)[safe_offset(part - 1)] (split_part counts parts from 1, while safe_offset counts from 0)
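Putting a few of these tricks together, here is a sketch of how a Redshift snippet might translate. The table and column names (users, city, email, and so on) are made up for illustration. The Redshift version:

```sql
SELECT
  user_id::VARCHAR AS uid,
  nvl(city, 'unknown') AS city,
  split_part(email, '@', 2) AS domain   -- part 2, counted from 1
FROM users
WHERE name ILIKE '%smith%';
```

And the BigQuery translation:

```sql
SELECT
  CAST(user_id AS STRING) AS uid,
  ifnull(city, 'unknown') AS city,
  split(email, '@')[safe_offset(1)] AS domain  -- offset 1, counted from 0
FROM `dataset.users`
WHERE UPPER(name) LIKE UPPER('%smith%');
```

Note the off-by-one between split_part and safe_offset: part 2 in Redshift becomes offset 1 in BigQuery.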

In Redshift we can reuse a column alias later in the same SELECT list:

SELECT
  SQRT(score) AS new_score,
  new_score * 10
FROM ...

But BigQuery doesn’t let us reference an “AS” alias within the same SELECT list, so the expression has to be repeated:

SELECT
  SQRT(score) AS new_score,
  SQRT(score) * 10
FROM ...
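If the expression is expensive or repeated many times, one common workaround (a general SQL pattern, not specific to this migration) is to compute the alias in a subquery and refer to it in the outer query:

```sql
SELECT
  new_score,
  new_score * 10       -- the alias is visible here, one level up
FROM (
  SELECT SQRT(score) AS new_score
  FROM ...
)
```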

And BigQuery has the “WITH” clause to replace the “temporary table”, which is very powerful. CTEs can be chained, with later ones referring to earlier ones:

WITH example AS (
  SELECT * FROM `dataset.table`
),
result AS (
  SELECT * FROM example
)
SELECT * FROM result