Hi @andatki! First of all, thank you so much for this amazing book!
I’ve found something that can cause a bit of confusion.
After seeding the data in order and running the `bulk_load` script, the reader will have about 10M rows in the `users` table. But on page 56, the first statement that copies the scrubbed data into the `users_copy` table fails with a statement timeout (at least on my machine), because the book expects the `users` table to contain only the 20,200 records added with `rails data_generators:generate_all`, whereas, as mentioned, it actually contains about 10M.
This shouldn’t be a problem for most readers, but it can cause confusion and slightly breaks the consistency of the walkthrough.
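
In case it helps, one way around the timeout might be to raise `statement_timeout` just for that session before running the copy. This is only a sketch: the `INSERT INTO users_copy SELECT ...` below is a placeholder, since the actual copy statement on page 56 may differ.

```sql
-- Raise the timeout only for this transaction so the ~10M-row copy can finish.
-- '300s' is an arbitrary value; adjust as needed.
BEGIN;
SET LOCAL statement_timeout = '300s';

-- Placeholder for the book's actual scrubbed-copy statement from page 56.
INSERT INTO users_copy
SELECT * FROM users;

COMMIT;
```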