Changing IOPub data rate in JupyterLab





I will preface this by saying that I am very new to Python and PostgreSQL, and to programming in general. I am currently querying data from a PostgreSQL server and storing it in a Python program running on JupyterLab 0.32.1. Up until this point I have had no problems querying this data, but now I am receiving an error.


import psycopg2 as p

# connect to the local PostgreSQL database and fetch every order ID
ryandata = p.connect(dbname="agent_rating")
rcurr = ryandata.cursor()
rcurr.execute("SELECT ordlog_id FROM eta")
data = rcurr.fetchall()

mylist = []
for i in range(len(data)):
    orderid = data[i]
    mylist.append(orderid)
print(mylist)



IOPub data rate exceeded.
The notebook server will temporarily stop sending output
to the client in order to avoid crashing it.
To change this limit, set the config variable
`--NotebookApp.iopub_data_rate_limit`.

Current values:
NotebookApp.iopub_data_rate_limit=1000000.0 (bytes/sec)
NotebookApp.rate_limit_window=3.0 (secs)
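As the message itself suggests, the limit can be raised by starting the server with a larger `--NotebookApp.iopub_data_rate_limit`. A minimal sketch, assuming a recent Jupyter Notebook/JupyterLab install; the value `1e10` is an arbitrary large number, not an official recommendation:

```shell
# Launch JupyterLab with a higher IOPub data rate limit
jupyter lab --NotebookApp.iopub_data_rate_limit=1e10
```

To make the change permanent, the same setting can go in the config file (generated with `jupyter notebook --generate-config`) as `c.NotebookApp.iopub_data_rate_limit = 1e10`. Note the limit only governs how fast output is streamed to the browser; printing a very large list all at once will still be slow, so printing a slice or the length is often the better fix.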



Can anyone help fix this?





I think you forgot to post the error. Also, make sure to post the code that is giving this error so we can see what could be causing it.
– AMACB
Jun 29 at 16:48





Oops, also new to this site. I've edited my original question.
– rankersen
Jun 29 at 16:54









