As labor costs increase for orchards, growers are looking for ways to cut expenses without compromising the quality of the harvest. One way to reduce labor costs is to introduce automation into the harvesting process. Automation already plays a large role in the harvest of produce in Europe, and it is being introduced into United States orchards in states like Washington and Pennsylvania.
Our client was a group of researchers at Carnegie Mellon working on a project called Comprehensive Automation for Specialty Crops (CASC). They were developing a self-steering harvesting platform, called an automated prime mover (APM), that could provide a cheaper alternative to the larger, more expensive harvesting platforms used in Europe. Our group was charged with designing a control panel for the APM.
The machine the researchers had designed was a simple Toro golf cart fitted with a platform that moves up and down to allow workers to harvest fruit at different levels of an apple tree. The cart is equipped with a sensor that allows it to detect its position in the orchard row and steer itself. The APM needed a clear, intuitive control panel and interface that would require minimal effort for workers to operate.
Our group began the design process by meeting with the client, getting an overview of the project, and conducting an affinity diagramming session to tease out the salient issues stakeholders face. From the diagramming session our group developed three areas of focus: understanding the external factors that affect the harvesting process, understanding the worker demographics and interactions that affect the harvesting process, and understanding the manual and automated operations of pruning, thinning, and harvesting.
We then arranged our first contextual inquiry at Penn State Fruit Research and Extension Center’s (FREC) research orchard in Biglerville, Pennsylvania. In this contextual inquiry we were able to see the N. Blosi harvesting platform on which the Toro automated prime mover was modeled. We also gained insight into how workers conduct pruning, thinning, and harvesting operations while operating the N. Blosi.
Next, we conducted a second contextual inquiry at a traditional orchard. Workers in this orchard use ladders to move up and down the tree and harvest apples. This contextual inquiry helped us understand the physical strain that traditional harvesting operations can put on workers, and which parts of the harvesting process could be made easier with automation.
After we finished conducting contextual inquiries, we began to consolidate our data to see patterns in the work flow, task sequence, and culture of the orchards. We represented this consolidation of data in models that noted the breakdowns in task sequence, work flow, and communication.
Some of the most important breakdowns were:
- the controls on the N. Blosi give little indication of their function and provide no feedback about the status of the machine
- the worker experiences role strain when expected to steer a machine and pick apples at the same time
- workers sometimes have trouble deciding on a speed that accommodates different picking paces
- it is difficult for workers to see which way the wheels on the machine are turned, which makes workers likely to move the N. Blosi in an unexpected direction
Designing from Data
The consolidated data and the breakdowns we discovered provided inspiration for a visioning session. The ideas we came up with provided the basis for a low-fidelity prototype. We addressed the wheel position feedback breakdown by providing a display of the machine’s current wheel position. To facilitate agreement and productive conversation about vehicle speed, we displayed the speed prominently on the interface. We also added a voice command system that frees workers to focus on picking fruit rather than on driving the machine.
User Testing and Further Design Iterations
We brought this prototype to Soergel orchard to conduct a think-aloud test with three workers. The major problems were confusion about how to use our joystick to move the vehicle forward and confusion about how to use our system of voice commands. The workers we tested were happy with our controls and feedback for platform height, as well as with the joystick’s indication of automatic driving mode.
With the feedback and data we gathered from the think-aloud test, we wrote usability aspect reports (UARs) on the critical incidents that occurred with each user. We rated each incident on a scale from minor to severe. These reports guided us in selecting aspects of our design to modify or eliminate.
Once we finished the second iteration of our prototype, we brought it to test with workers at the Biglerville orchard. We built the interface prototype in Flex and augmented it with physical steering and speed controls. At the orchard, we conducted a Wizard-of-Oz style user test: users interacted with the controls and interface, but members of our team actually controlled the changes in the interface and the position and speed of the N. Blosi platform.
Overall, workers were pleased with the final interface and controls. One of the main problems we noted was the visibility of our interface in sunlight. Workers had difficulty seeing the interface display, so it was hard for them to see the visual feedback that resulted from their use of the manual controls. The correct way to extend the sides of the platform was also not obvious to workers. On the other hand, workers found that the joystick control was clear and intuitive, our voice commands were easy to use, and switching to reverse gear was logical and natural.
After our final user test we presented our findings to our client. He was impressed with our final product and has now requested that another group of students further develop our design and prepare it for implementation on the automated prime mover.
User Research | User Testing | Model Making | Prototype Building
Contextual Inquiry | Concept Generation | Prototyping | Wizard-of-Oz | Usability Aspect Reports | Think-Aloud Usability Testing | CogTool | Omnigraffle | Illustrator | Flex
Marcel Bergerman of the Carnegie Mellon Robotics Institute
Human-Computer Interaction Methods
Julie Bae, John Michael Flowers, Tiffany Ng, Nastasha Tan