Iteration 2

Group Members: Ahmed, Karthik, Kattie

---

In my last project iteration blog, I noted that our team focused too much on trying to represent our bar chart data haptically, and not enough on using the modality itself to enrich the visualization experience, i.e., making it so that part of the user experience would be lost without the haptics. We kept that point in mind for the second iteration, even though we weren't initially clear on our objective.

Luckily, I received my Haply just a couple of days after the first iteration, which gave me the chance to work on force-feedback for this iteration. Assembling the Haply wasn't too problematic once I realized that my own screwdriver set worked better on the screws than the Allen key that came with the package. I somehow managed to snap one of the motor wires, but rejoining it fortunately wasn't a problem. Altogether, assembly took approximately an hour and a half.

Fixing a snapped motor wire during assembly... yikes.

After trying out some of the Haply library examples to familiarize myself with the dynamics of the device, I tried Kattie's code from iteration 1, as it was the only bar chart example from our group that utilized the Haply. The experience confirmed what I wrote in the very first sentence of this blog post: while the force-feedback from the bar "walls" felt satisfying (I know, it's difficult to convey feelings when describing a haptic device), it was difficult for me to feel the trend in the data and keep track of where I was in the chart without the visuals.

We believed a scatter plot would be a more appropriate graph for this iteration. It gives us more flexibility with the haptics, as there are more dimensions of data to work with than in a pie or bar chart (allowing us, for example, to offload one of the dimensions of the data to the haptic modality). It would also let us tackle a potentially interesting problem I brought up in my previous blog post: could haptics be used to navigate to a point in a large scatter plot faster than with a mouse? Thus, we decided to work with a scatter plot for this iteration.

The giCentre visualization library we used in the previous iteration supports scatter plots, but I came across another package, grafica, that provides more functionality, such as panning and zooming, which might be useful for future tasks. Importantly, it came with an example we believed would be perfect for trying out our haptics code: a life expectancy scatter plot - familiar data, plenty of data points, and several dimensions of information.
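For context, a minimal grafica scatter plot with panning and zooming enabled looks roughly like the sketch below; the random points are placeholders standing in for the life expectancy data, and the axis ranges are made up.

```java
import grafica.*;

GPlot plot;

void setup() {
  size(600, 400);
  plot = new GPlot(this);

  // Placeholder points standing in for the life expectancy example data
  GPointsArray points = new GPointsArray(100);
  for (int i = 0; i < 100; i++) {
    points.add(random(40, 85), random(0, 8));
  }
  plot.setPoints(points);
  plot.setTitleText("Life expectancy (placeholder data)");

  // The interaction features that drew us to grafica
  plot.activatePanning();
  plot.activateZooming(1.1, CENTER, CENTER);
}

void draw() {
  background(255);
  plot.defaultDraw();
}
```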

Using the Haply to navigate to a particular point seemed like a good start to enriching the visualization experience. Although I received my Haply late and thus wasn't familiar with its workings, I got up to speed on haptic guidance by playing around with the PID code provided by the instructors (the code from lab 4). The first step was to integrate this plot into the PID code, which was a relatively simple task. The idea I then had was to load the data, allow the user to select a point of interest (e.g., the name of a country through a textbox control or speech recognition), and map the corresponding data point from the scatter plot to the coordinates of the "setpoint" of the Haply's target. The end-effector would then navigate to the setpoint "automatically" via PID control, with the PID parameters tuned to particular values. To "map" the coordinates from the scatter plot's system to the Haply's, I determined the corresponding coordinates of three roughly equidistant points in each system by printing their values to the console, and plotted their relationship in Excel. The relationship turned out to be linear, so all that was required was a simple y = mx + c style mapping.
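The mapping step then boils down to something like this sketch. The slope and intercept values are made up for illustration, and xr/yr stand in for the setpoint variables in the lab 4 PID code.

```java
// Hypothetical calibration constants from the Excel line fits (y = mx + c);
// the real values depend on the plot's pixel layout and the Haply workspace.
float mX = 0.00021, cX = -0.064;   // graph x (pixels) -> Haply x (metres)
float mY = 0.00019, cY =  0.052;   // graph y (pixels) -> Haply y (metres)

float xr, yr;                      // PID setpoint, as in the lab 4 code

// Point the PID controller at a selected data point on the scatter plot;
// the PID loop then drives the end-effector toward (xr, yr).
void setSetpointFromGraph(float graphX, float graphY) {
  xr = mX * graphX + cX;
  yr = mY * graphY + cY;
}
```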


Here is a video showing the effect:

It's somewhat hardcoded and far from perfect, but I was surprised at how well the navigation worked on the first attempt by tuning only the P parameter, regardless of the distance between the two points; I had thought I would have to deal with the headache of fine-tuning all the PID values and the end-effector potentially becoming unstable. My teammates reported no issues, although Kattie recommended I include derivative control, as the higher stiffness of the spring would then allow for a better "landing" at the setpoint.

Secondly, I wanted to experiment with exploring the trend of the data using Haply force-feedback effects such as "stickiness". While this could be a useful way for a visually impaired user to access the data, the main purpose was to establish some foundation code that would make it easy to add interaction effects for future tasks. I wanted to add tactile feedback in two ways: 1) force-feedback using the Haply, as mentioned above, and 2) vibrotactile actuation while exploring the data with the Haply's end-effector (using the same linear resonant actuator (LRA) that was used for the bar chart in the previous iteration). In the second case, the vibrotactile actuator would simply be attached to the top of the end-effector. Since the LRA is connected to my computer's audio system via an amplifier, it would be driven by audio; therefore, the Minim sound library would be used to generate tones for the vibration.
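As a minimal sketch of the audio-driven actuation using Minim's UGen API: a sine oscillator is patched to the line out (which reaches the LRA through the amplifier), and its amplitude is set from whatever intensity the interaction computes. The 175 Hz carrier is an assumed value near a typical LRA resonant frequency, not the project's actual setting.

```java
import ddf.minim.*;
import ddf.minim.ugens.*;

Minim minim;
AudioOutput out;
Oscil tone;

void setup() {
  minim = new Minim(this);
  out = minim.getLineOut();                 // reaches the LRA via the amplifier
  tone = new Oscil(175, 0.0, Waves.SINE);   // assumed carrier near LRA resonance
  tone.patch(out);
}

// Drive the LRA harder as the computed intensity grows (expects [0, 1]).
void setVibration(float intensity) {
  tone.setAmplitude(constrain(intensity, 0, 1));
}
```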

For both scenarios, the intensity of the feedback (vibration or damping from the Haply) depends on the sum of the squared inverse distances between the Haply's current position and the data points. This is based on a formula used by Panëels and Roberts [1] for a similar task performed with another force-feedback device. As a starting point, only the y-axis distance is used in this code. Since vibrotactile actuation was already something I had worked on, it was pretty easy to implement. A video of it is shown below:
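In code, that calculation amounts to something like the following sketch; the epsilon term and the normalization note are my own additions to keep the example well-behaved.

```java
// Feedback intensity at the end-effector, based on the sum of squared
// inverse distances to the data points (after Panëels and Roberts [1]).
// As in this iteration's code, only the y-axis distance is considered.
float feedbackIntensity(float effectorY, float[] pointYs) {
  float eps = 1e-4;                // avoids division by zero on top of a point
  float sum = 0;
  for (float py : pointYs) {
    float dy = effectorY - py;
    sum += 1.0 / (dy * dy + eps);  // nearby points dominate the sum
  }
  // Scale/normalize the result before mapping it to a vibration amplitude
  // or a damping coefficient.
  return sum;
}
```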

On the other hand, working with force-feedback in the PID code proved to be a lot more difficult. I was unable to incorporate the damping effect into the simulation thread of the PID code. I was able to "fix" this by running a separate thread for each type of simulation and rendering each based on a "mode" triggered by a keypress, while also using a separate Haply avatar and mapping system for each mode.
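Structurally, the workaround looks something like this sketch; the class and variable names are illustrative rather than taken from the actual code.

```java
int mode = 0;  // 0: PID navigation, 1: vibrotactile, 2: force-feedback damping

void keyPressed() {
  if (key >= '0' && key <= '2') {
    mode = key - '0';   // switch which simulation is rendered
  }
}

// One Runnable per effect, each scheduled on its own thread; the guard at
// the top of run() means only the active mode computes and renders forces.
class DampingSimulation implements Runnable {
  public void run() {
    if (mode != 2) return;  // inactive this tick
    // Read the end-effector position (using this mode's own avatar and
    // mapping), scale the damping by the intensity function above, and
    // write the resulting torques to the Haply.
  }
}
```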

The video below shows the effect:


Thoughts:

Unlike the previous iteration, which felt more experimentation-based, I feel the work presented here has more "practical value". Working with the Haply was, at times, frustrating, but mostly a fun experience. Coding the navigation task was simpler than I had imagined, and the results seem pretty good. On the other hand, figuring out a way to incorporate interaction effects for force-feedback in the second task took an entire day of debugging. It would be interesting to see what others think of the navigation experience, as I certainly believe it provides a unique experience of its own (going back to the "is the interaction experience lost without the haptics?" point) and would help users navigate to a certain point quicker than the auditory or visual modalities would. Unfortunately, we weren't able to ask other course mates to test the effectiveness (or lack thereof) of these functions due to time constraints, but we can nevertheless gather feedback during the next iteration of work.

For the second task, i.e., tactile exploration of the data, I felt that vibrotactile actuation did a better job of conveying how close together the points were, but the function itself did not seem too useful. Even for a visually impaired user, the effect may not be valuable, since it is still difficult to feel your position on the scatter plot. Perhaps we could gather input from visually impaired users and work from there. Karthik tried out something similar and thought it was easy to tell when he was at the "center" (mean) of the data based on the vibration intensity, suggesting that the effect could be promising for conveying additional information such as statistical measures. Kattie suggested adding haptic effects to overlapping data points, which seems like an interesting problem to work on, though she later reflected that this would be difficult to do precisely with haptics. There are some other ideas I can think of, such as generalizing the code to work with other data sets (going as far as extracting the data from a chart image on a website), or working with other graphs such as line charts or scientific diagrams. One problem that is still prevalent with haptically representing line charts is the confusion in distinguishing between lines that have some overlap [2].

One of our project supervisors noted in a meeting that we may have crippled ourselves somewhat by using the "gold standard" dataset: an interactive visual representation of data that the haptics would have a difficult time "competing with". However, if we can make the haptics feel useful in their own right, such as using the Haply to quickly navigate to a particular point or buzzing an actuator in various ways to convey information not directly presented in the graphs (such as the mean, median, or mode), the experience should feel richer.

In any case, we might gather feedback during our demo in class today [29th March] to see if our course mates or the instructors have interesting insights that could inspire us for our final iteration work (and perhaps even after). Now that I have a better understanding of the Haply, I'm interested in seeing what we can do, especially since I have vibrotactile actuators to utilize as another form of feedback.

---

Code can be found here. For reference:

Mode 0 (default, pressing 0 on the keyboard): Allows you to navigate between points using PID effects.

Mode 1 (pressing 1 on the keyboard): Allows you to feel the data "vibrotactically". Vibrotactile actuators are required to feel the effects, but since the system works on audio, users can still hear changes in the audio's amplitude based on their position in the graph. Users may need to change a variable, device_num, at the very top of the code to match their computer's audio output device (see the sketch below). Only tested on Windows.

Mode 2 (pressing 2 on the keyboard): Allows the user to feel the data via force-feedback damping effects. This was rushed, so it is just a rudimentary sketch for now.

The Haply should be placed at its "origin" before testing out a different mode.
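For the device selection mentioned under mode 1, the wiring in Minim looks roughly like this; device_num is the variable from the code, while the rest is an assumed-typical setup rather than the project's exact code.

```java
import javax.sound.sampled.*;
import ddf.minim.*;

Minim minim;
AudioOutput out;

int device_num = 0;  // index of the audio output device driving the LRA

void setup() {
  minim = new Minim(this);
  // Print the available devices once to find the right index for your machine
  Mixer.Info[] mixers = AudioSystem.getMixerInfo();
  for (int i = 0; i < mixers.length; i++) {
    println(i + ": " + mixers[i].getName());
  }
  minim.setOutputMixer(AudioSystem.getMixer(mixers[device_num]));
  out = minim.getLineOut();
}
```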

Thanks for reading.

References:

[1] S. Panëels, J. C. Roberts, and P. J. Rodgers, "Haptic Interaction Techniques for Exploring Chart Data," in Haptic and Audio Interaction Design (HAID 2009), Lecture Notes in Computer Science, vol. 5763, Springer, Berlin, Heidelberg, 2009, doi: 10.1007/978-3-642-04076-4_4.

[2] S. Panëels and J. C. Roberts, "Review of Designs for Haptic Data Visualization," IEEE Transactions on Haptics, vol. 3, no. 2, pp. 119-137, April-June 2010, doi: 10.1109/TOH.2009.44.
