Survey Site Redux

The Great Architectural Survey of 2013 is complete. My students have now moved on to the infinitely more despised SPSS analysis and the painstaking InDesign layout. Still more information will come out of the survey in the weeks to come, but I can now start assessing how it went. In short: splendidly, with a few minor hiccups. A more detailed (and, predictably for me, long-winded) assessment is below.

Students reported being satisfied with the new surveying process. They had to carry less gear, which was generally appreciated. The new system also meant much less time spent post-processing, re-copying, and generally doing busywork (for them, anyway).

However, some problems did arise in the process of data collection:

Photography was sometimes an issue. One left-handed student took every photo upside down, which cost me a half-hour of rotating them one by one. Photos taken on the mobile device and uploaded directly in the form were not saved locally on the phone, so some teams had to go back out to re-photograph after upload failures. Uploading pictures later on also caused problems: some students mistakenly tried to upload files of 10 MB or more, leading to freezes and crashes. Conversely, some sketches were scanned at too low a resolution to be legible.
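One thing I don’t want to repeat is rotating photos by hand. A batch fix is easy to script; here’s a minimal sketch using Python’s Pillow library, assuming a team’s upside-down photos sit together in one folder (the path is made up):

```python
# Sketch: batch-rotate upside-down photos instead of fixing each by hand.
# The folder path is hypothetical.
from pathlib import Path
from PIL import Image

photo_dir = Path("survey_photos/team_07")  # assumed location of one team's photos

for photo in photo_dir.glob("*.jpg"):
    with Image.open(photo) as img:
        # An upside-down photo is just a 180-degree rotation away from correct.
        img.rotate(180).save(photo, quality=95)
    print(f"rotated {photo.name}")
```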

While students didn’t have much post-processing to do, I most certainly did. In particular, there were duplicates galore. We never figured out exactly what caused them, but they were pervasive. That said, they were also easy to fix. The text in the response fields also had some errors: if students pressed return inside a field, an “rn” (presumably the remains of a mangled line break) appeared in the entry once uploaded. Some entries were wrapped in quotation marks for a reason I still can’t figure out. Students also didn’t always follow directions to enter only the street name, not “street” or “avenue”, which meant more post-processing for me (again).
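Most of that cleanup is scriptable, which is my plan for next year. Here’s a rough sketch of the pass I did by hand, written with Python’s pandas library; the file and column names are assumptions, not the actual export schema:

```python
# Sketch of the post-processing pass: dedupe, strip stray characters,
# and normalize street names. File and column names are hypothetical.
import pandas as pd

df = pd.read_csv("survey_export.csv")  # assumed CSV export of the form data

# The duplicates were exact copies, so a straight drop handles them.
df = df.drop_duplicates()

for col in ["condition_notes", "street"]:
    df[col] = (
        df[col].astype(str)
        .str.replace(r"\brn\b", " ", regex=True)  # the stray "rn" left by in-field returns
        .str.strip('"')                           # the mystery wrapping quotes
        .str.strip()
    )

# Keep only the street name, dropping "Street"/"Avenue" style suffixes.
df["street"] = df["street"].str.replace(
    r"\s+(street|st\.?|avenue|ave\.?)$", "", regex=True, case=False
)

df.to_csv("survey_clean.csv", index=False)
```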

To fix all these issues, I had each team email me the problem entries after they were done with field work. That way, I only had to go in ten times to fix the problems. I think this worked well, in particular because it encouraged all the teams to go through a complete quality control step before submitting their work.

Most importantly, and 100% to Martha Burtis’ credit, her WordPress plugin to export the data to SPSS worked PERFECTLY. All I had to do was check the boxes for the fields I wanted, and voilà! The .sav file I wanted was exported just like that. Compared to my old ten-Excel-spreadsheets-to-SPSS exercise in frustration, this felt like a chorus from heaven as I clicked “export”.
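I can’t speak to what the plugin does under the hood, but for the curious (or anyone without a WordPress setup), the equivalent export step in Python looks roughly like this, using the pyreadstat library; the file and field names here are hypothetical:

```python
# Sketch: write selected fields out to an SPSS .sav file with pyreadstat.
# This is NOT the plugin's code; all names are assumptions for illustration.
import pandas as pd
import pyreadstat

df = pd.read_csv("survey_clean.csv")  # assumed cleaned export

# "Check the boxes for the fields I wanted," script edition.
fields = ["street", "style", "condition", "stories"]
pyreadstat.write_sav(df[fields], "survey_2013.sav")
```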

So, what’s on the to-do list to improve this process next year? I have a few ideas, mostly based on students’ feedback:

– Integrate definitions into the survey. I have a PDF with definitions of important terms, and in theory students can open it on their phones, but links directly in the form would be even better. Not sure what “excellent condition” means? Click and find out. That would definitely help. I think it would be easy to implement, but I’ll have to ask Martha.

– Make sure photos are saved locally even when taken inside the form.

– Prevent left-handed people from taking their team photos on their phones (I am kidding. Kind of. Rotating all those pics was a pain.)

[Image: Ned Flanders, famous lefty]
Stupid Ned Flanders!

– Spend more time on the post-fieldwork portion of the data collection. The comments section of the form, where teams could respond to each other’s entries, was actually more useful than I had anticipated. Students gave each other lots of great feedback, but I hadn’t built time for peer evaluation into the submission process, and not everyone knew to sign their comments with their names. Next year, I want to make this a more formal part of the class and give students time to respond to each other before exporting the SPSS data for analysis. I think this will be great both for the quality of the data and for the learning process.

– The layout of the form could still use some improvement. Organizing the data was sometimes cumbersome, and the resulting views were not always easy to read. For instance, I had thought the thumbnails would help, but honestly they are so small they don’t do much. Conversely, the “map views” were very useful for seeing all of a given team’s entries at once, but their list view lacked important information.

All in all, we’re talking about very minor tweaking. There is no doubt in my mind that this system improved the data collection process: it made things easier on the students while also garnering better data. What’s not to like?

I must give credit where it’s due: to DTLT. Martha was incredibly responsive throughout the entire process, fixing issues as they arose. For instance, she added page views when the survey became so large that the page loaded too slowly, and took out the logging limitations when they proved too cumbersome. In other words, and as usual, Martha kicked butt. Thanks, Martha!

Fortunate as I am to have DTLT’s support, I’m already thinking of fun new projects to implement.
