Import CSV Question

I have students who are using open data websites to find datasets and then create visualizations. We are importing CSVs, but some will not import. After we click the “import csv” button, the progress wheel spins for a minute or two and then nothing happens. Is there a file size limit or other information I should be looking for? I tried to attach the file to this post, but it says it’s too large. Here is a link to the site.

Not sure what the limit is, but there definitely is one. Some of the practice levels provided last year were not functioning correctly because of large datasets. You may want to write to support@code.org to find out what the exact limit is.

I just found some documentation saying that datasets have to be 10,000 or fewer rows.

I have a student who downloaded a dataset from Kaggle, and it would not import. We opened it fine in Google Sheets and saw a lot of non-ASCII emojis in some of the columns. After deleting those columns in Sheets and downloading a new CSV, it loaded into App Lab. My conclusion is that App Lab data import does not work with emojis, at least some of them some of the time!
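If you want to check a file before importing, a rough sketch like this (plain JavaScript, not an App Lab feature; csvText is just example data standing in for the file contents) will flag rows that contain non-ASCII characters:

// Flag any rows in the CSV text that contain characters outside the ASCII range.
// csvText is a placeholder -- in practice you'd paste the file contents here.
var csvText = "name,comment\nAlice,hello\nBob,party 🎈";

var rows = csvText.split("\n");
for (var i = 0; i < rows.length; i++) {
  // Match any character with a code point above 127.
  if (/[^\x00-\x7F]/.test(rows[i])) {
    console.log("Non-ASCII character in row " + (i + 1) + ": " + rows[i]);
  }
}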

Great insight @will_wright. Thanks for sharing!

If CDO doesn’t have a way of accounting for UTF codes, the import will most likely fail, so your assumption is probably correct. However, I’m curious whether you could get away with storing all the escape codes as plain text, "\\ud83c\\udf88" for example. It may be more work, but it could be worth a try if you don’t want to compromise the original dataset.
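Something like this could do the escaping up front (just a sketch in plain JavaScript; escapeNonAscii is a made-up helper, not an App Lab function):

// Replace every non-ASCII character with its \uXXXX escape code as plain text.
function escapeNonAscii(text) {
  return text.replace(/[^\x00-\x7F]/g, function (ch) {
    // charCodeAt gives the UTF-16 code unit; pad it to 4 hex digits.
    var hex = ch.charCodeAt(0).toString(16);
    while (hex.length < 4) {
      hex = "0" + hex;
    }
    return "\\u" + hex;
  });
}

console.log(escapeNonAscii("party 🎈")); // party \ud83c\udf88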

Then, to decode the column afterward, you could either strip the double escapes or parse the column strings like this:

JSON.parse('"' + "\\ud83c\\udf88" + '"')

It’d be finicky, but if this works, no modification to the CSV table should be necessary other than escaping all the UTF emojis manually.
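If a whole column is stored that way, decoding it might look roughly like this (columnValues is just a placeholder array; your actual data table calls would supply it):

// Decode a column of escape-code text back into the original characters.
// columnValues is a placeholder -- in App Lab you'd fill it from your data table.
var columnValues = ["party \\ud83c\\udf88", "plain text"];

var decoded = [];
for (var i = 0; i < columnValues.length; i++) {
  // Wrapping the cell in quotes lets JSON.parse interpret the \uXXXX escapes.
  // (Assumes the cell doesn't itself contain unescaped quote characters.)
  decoded.push(JSON.parse('"' + columnValues[i] + '"'));
}

console.log(decoded[0]); // party 🎈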

Varrience