Spark SQL nested query


I have the following Spark SQL query:


SELECT count(*),
       channel
FROM channelusage a
WHERE a.starttime >= windowstarttime
  AND a.endtime <= windowendtime
GROUP BY channel



I have to generate these counts for 10 windows. Currently I use a while loop to generate the windowstarttime and windowendtime values and run the query once per window (a rough sketch of that approach is below).
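
For context, a simplified sketch of the kind of loop-driven approach I use today (the Scala names and timestamps here are placeholders, not my exact code):

import java.sql.Timestamp
import org.apache.spark.sql.{DataFrame, SparkSession}

val spark = SparkSession.builder().appName("channel-window-counts").getOrCreate()
// channelusage is assumed to be registered as a temp view already.

val windowLengthSeconds = 10L
var windowStart = Timestamp.valueOf("2019-01-01 11:00:01")  // placeholder start time
var remaining = 10
var results: List[DataFrame] = Nil

while (remaining > 0) {
  val windowEnd = new Timestamp(windowStart.getTime + windowLengthSeconds * 1000)
  // Run the same aggregation once per window, substituting the bounds.
  val counts = spark.sql(
    s"""SELECT count(*) AS count, channel
       |FROM channelusage a
       |WHERE a.starttime >= '$windowStart' AND a.endtime <= '$windowEnd'
       |GROUP BY channel""".stripMargin)
  results = counts :: results
  windowStart = windowEnd
  remaining -= 1
}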



What I want to do instead is generate the windows in the SQL query itself, with something like a nested query, similar to this:


SELECT count(*),
       channel
FROM channelusage a
WHERE (nested query logic)
GROUP BY channel



so that I get output similar to this:


windowstarttime | windowendtime | channel | count
11:00:01        | 11:00:10      | ABC     | 2
11:00:11        | 11:00:20      | ABC     | 4
11:00:21        | 11:00:30      | NBC     | 10
11:00:31        | 11:00:40      | CNN     | 5
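
One direction I have been looking at (a sketch only, not a confirmed solution) is Spark SQL's built-in window() grouping function, with a nested query to flatten the resulting struct into windowstarttime/windowendtime columns. It assumes starttime is a timestamp column, buckets on starttime only (not endtime), and produces fixed 10-second tumbling windows aligned to clock boundaries, which may differ slightly from the window bounds I generate in the loop:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("channel-window-counts").getOrCreate()

// Outer query flattens the window struct produced by the inner GROUP BY.
val windowedCounts = spark.sql(
  """SELECT w.start AS windowstarttime,
    |       w.end   AS windowendtime,
    |       channel,
    |       cnt     AS count
    |FROM (
    |  SELECT window(starttime, '10 seconds') AS w,
    |         channel,
    |         count(*) AS cnt
    |  FROM channelusage
    |  GROUP BY window(starttime, '10 seconds'), channel
    |) grouped
    |ORDER BY windowstarttime, channel""".stripMargin)

windowedCounts.show(false)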





What is your input data set?
– Kannan Kandasamy, Jun 29 at 20:24

input is required here
– thebluephantom, yesterday









