
Show DN: I made some tools to help you do remote UX research, everything open source

15 days ago, Professor

My students and I at Brown University have been making tools to help with UX research, specifically UX research you can do with anyone in the world without having to interact with them in person. We spend years making them ready for real-world use, even after the papers are published. I'd love to have more people use them for UX research, tell us whether they help with your job, or give us feedback and suggestions. We're funded by American tax dollars, so we make everything open source. Here's what we've got:

WebGazer.js: eye tracking on laptops -- have your UX participants go to your websites, and track where they're looking, for how long, in what order, etc. Think of it as a Tobii eye tracker, but using the webcam on your participants' own laptops, so they can do usability tests from their own homes. Obviously, their consent is needed, and the accuracy is not as good as Tobii/SMI, since it uses a consumer RGB camera rather than infrared.

https://webgazer.cs.brown.edu/

We offer the base eye-tracking library so it can be integrated into any website. If you're looking for something completely end-to-end, there's a company (not advertised in this post) that wraps our eye tracker and sells it as a service at a modest price. But I think we've provided enough documentation for you to get things up and running with some programming experience.
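To give a feel for the "where, for how long, in what order" analysis: WebGazer's documented pattern is to register a listener with `webgazer.setGazeListener((data, elapsedTime) => ...).begin()`, which streams gaze predictions as `{x, y}` screen coordinates. What you do with those samples is up to you; here's a minimal sketch (my own illustration, not part of WebGazer) that aggregates collected samples into per-region dwell times, where the regions would typically be bounding boxes of page elements:

```javascript
// Aggregate raw gaze samples into per-region dwell times.
// Each sample: { x, y, t } -- screen coordinates plus a timestamp in ms
// (e.g. collected from WebGazer's gaze listener callback).
// Each region: { name, left, top, right, bottom } -- e.g. an element's bounding box.
function dwellTimes(samples, regions) {
  const totals = {};
  for (let i = 1; i < samples.length; i++) {
    const prev = samples[i - 1];
    const dt = samples[i].t - prev.t; // attribute the gap to the previous sample
    for (const r of regions) {
      if (prev.x >= r.left && prev.x < r.right &&
          prev.y >= r.top && prev.y < r.bottom) {
        totals[r.name] = (totals[r.name] || 0) + dt;
      }
    }
  }
  return totals;
}
```

From the resulting totals you can read off which parts of the page held attention and for how long; the order of first entry into each region gives you the scan path.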

If you're doing mobile UX, I think the motion of the device tells you a lot about user attention, and we have a paper about motion as a signal at https://jeffhuang.com/Final_Remotion_IMWUT18.pdf. So we built a motion replay system, Remotion, that replays the posture of the phone during a user session. Imagine being able to see from afar when a user puts down the phone, when they shake the device because they're agitated, or when they turn their phone from sideways to upright.

https://remotion.cs.brown.edu/

You can do motion replay with just the software; it's open source and you can set it up yourself. But if you want the full experience of seeing the motion replay (as if the user were in your lab, holding the phone with a ghost hand), that requires a sort of robotic arm to move a physical phone.

You can see version 1 on the website, but we're building the next version, which has more axes of rotational freedom. We'll send you the upcoming version at cost: just DM me or email hci@brown.edu with your company/institution name and mailing address, plus a promise that you'll pay $150 when you receive it and later let us know about your experience using it. We'll build you one and ship it when it's ready (probably late fall). The $150 covers the cost of parts and shipping; we 3D print the pieces and buy the motors and components.
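The core idea of motion replay is simple enough to sketch. Suppose you've logged device-orientation samples during a session (the browser's DeviceOrientationEvent gives you alpha/beta/gamma angles in degrees); to drive a replay, on screen or on a robotic arm, you need the phone's pose at arbitrary replay times, not just at the logged timestamps. This is my own illustration, not Remotion's actual code:

```javascript
// Interpolate a logged orientation track to get the phone's pose at any
// replay time. track: array of { t, alpha, beta, gamma } sorted by t (ms),
// as you might record from DeviceOrientationEvent.
// Note: plain linear interpolation ignores angle wraparound at 360 degrees;
// a real replay system would interpolate rotations properly (e.g. quaternions).
function poseAt(track, t) {
  if (t <= track[0].t) return track[0];
  const last = track[track.length - 1];
  if (t >= last.t) return last;
  // find the pair of samples surrounding t and linearly interpolate each angle
  let i = 1;
  while (track[i].t < t) i++;
  const a = track[i - 1], b = track[i];
  const f = (t - a.t) / (b.t - a.t);
  const lerp = (p, q) => p + (q - p) * f;
  return {
    t,
    alpha: lerp(a.alpha, b.alpha),
    beta: lerp(a.beta, b.beta),
    gamma: lerp(a.gamma, b.gamma),
  };
}
```

A replay loop then just calls `poseAt(track, now - replayStart)` each frame and applies the angles to a 3D phone model or to the arm's motors.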

Anyway, I'm all about making it possible to do UX research with participants in their own homes, on their own devices, so you get behavior that's as naturalistic as possible. I hope "advertising" here is appropriate, and that some of these are helpful (let me know how it goes!).
