Bridging ideas and reality with ProtoPie
Creating rich interactive prototypes has often been a struggle for designers who can't code or don't have access to a developer. We explored using ProtoPie to create highly interactive prototypes without touching a line of code.
I love trying out new design tools, particularly ones that let me explore new ideas fast and don’t demand a steep learning curve. ProtoPie’s no-code interaction prototyping surprised me: within minutes, it’s easy to design interactions far more advanced than a hover or click state.
The way it works is simple: each interaction in ProtoPie combines a trigger with one or more responses. Triggers can be touches, mouse events, text input, key presses, mobile sensors, or conditional logic. Responses are then attached to triggers and can be any combination of transform, style, or logic changes.
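ProtoPie sets all of this up visually, but the trigger–response pairing maps neatly onto a familiar event-handler pattern. Here is a rough conceptual sketch of the model (all names are hypothetical, not ProtoPie's actual API):

```python
# Conceptual sketch of the trigger-response model: one trigger can carry
# several responses, and firing the trigger runs them all.

class Interaction:
    """Pairs a trigger (event name) with a list of responses (callbacks)."""
    def __init__(self):
        self.responses = {}  # trigger name -> list of response callbacks

    def on(self, trigger, response):
        self.responses.setdefault(trigger, []).append(response)

    def fire(self, trigger, **event):
        for response in self.responses.get(trigger, []):
            response(**event)

# A "tap" trigger with two responses: move the element and change its style.
element = {"x": 0, "y": 0, "opacity": 1.0}

ui = Interaction()
ui.on("tap", lambda x, y: element.update(x=x, y=y))     # transform response
ui.on("tap", lambda x, y: element.update(opacity=0.5))  # style response

ui.fire("tap", x=120, y=80)
print(element)  # {'x': 120, 'y': 80, 'opacity': 0.5}
```

The point is the combinatorics: a handful of triggers times a handful of responses covers a surprisingly large space of interactions.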
It's a simple but robust system. An element that follows the mouse position, or an element whose appearance changes on scroll, can be prototyped in minutes. These are two classic interactions that in the past were only easy to prototype with the help of a developer. The best part? In ProtoPie they came together without writing a line of code, though you can definitely add custom code if you want to take things to the next level.
ProtoPie also has a feature called bridge that allows two prototypes to communicate with each other. Imagine, for example, prototyping and user testing a chat UI. The participant types a message into a text input on Prototype A and hits send; the same message then appears on Prototype B in real time. This is powerful stuff that can be built without much effort.
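In ProtoPie this send-and-receive wiring is configured visually, but the idea behind the chat scenario can be sketched as simple named-message passing (names and wiring here are hypothetical, purely to illustrate the concept):

```python
# Rough sketch of bridged messaging: Prototype A sends a named message,
# and Prototype B has registered a receive for that name.

subscribers = {}  # message name -> list of callbacks

def receive(message, callback):
    """Prototype B registers interest in a named message."""
    subscribers.setdefault(message, []).append(callback)

def send(message, payload):
    """Prototype A broadcasts the message to every listener."""
    for callback in subscribers.get(message, []):
        callback(payload)

# Prototype B: append incoming chat messages to its on-screen log.
chat_log = []
receive("chat-message", chat_log.append)

# Prototype A: the participant hits send.
send("chat-message", "Hi there!")
print(chat_log)  # ['Hi there!']
```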
We recently put bridge to the test while exploring a bring-your-own-device approach to museum exhibit interaction. First we set up the control interface on the user's device: a circle sitting in the middle of the screen, acting as a joystick. Next we set up the museum exhibit: a screenshot with the same circle from the remote sitting on top.
After 30 minutes or so of setup, the magic happened. When the joystick moves on the remote, the prototype sends a command carrying the coordinates of the joystick position. The exhibit is always listening for this command, so when it picks it up, it moves its own cursor to the same position as the remote cursor. Pretty cool! But the remote has different screen dimensions to the exhibit, right? The message that Prototype A sends can also be manipulated using a formula: in this case we scale the x and y coordinates up to match the dimensions of the exhibit prototype. Easy as!
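The scaling formula itself is just a per-axis ratio between the two screens. A small sketch, using illustrative screen dimensions (the actual numbers depend on your devices):

```python
# Scale joystick coordinates from the remote's screen to the exhibit's.
# Dimensions here are illustrative; the same ratio is what goes into the
# formula on the send/receive side in ProtoPie.

REMOTE_W, REMOTE_H = 390, 844      # phone screen (points)
EXHIBIT_W, EXHIBIT_H = 1920, 1080  # exhibit screen (pixels)

def scale_to_exhibit(x, y):
    """Map a point on the remote to the matching point on the exhibit."""
    return x * EXHIBIT_W / REMOTE_W, y * EXHIBIT_H / REMOTE_H

# The centre of the remote lands at the centre of the exhibit.
print(scale_to_exhibit(195, 422))  # (960.0, 540.0)
```

Because each axis is scaled independently, a joystick at any corner of the phone maps to the matching corner of the exhibit, regardless of the two screens' aspect ratios.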
The final result was a prototype that was extremely quick to set up. It gave us a greater understanding of how it would feel to use a mobile phone to control a cursor on another device. It also allowed us to think about the finer details of the interaction earlier in the process without needing to develop a prototype in code.