All posts by Anna W.

Seeing Climate Change through Google Search

We recently released a project in collaboration with some brilliant folks at Google Trends and WebGL extraordinaire Michael Chang. Google Trends, built on Google Search data, takes searches (terms and queries) and shows how often each one is entered relative to total search volume across various regions of the world. These results can also be viewed as time series going back to 2004, along with each search's related terms and topics.

When Simon Rogers approached us earlier this summer and asked if we'd be interested in a complex project with a lot of moving parts and unpredictability on a short timeline, we said yes. Google Trends wanted to create an experience based on climate change queries to unveil their new API to the public at the GEN Summit conference in Barcelona.

The Global Editors Network (GEN) is a community of editors and innovators dedicated to creating programs that reward, encourage, and provide opportunities for media organizations and journalists in the digital newsroom. The GEN Summit is an annual conference that brings together the “leading minds of the media industry to discuss the future of news” and showcases some of the innovations being made in the space.

So, a slightly different audience than we are used to. How could we create an interactive experience geared toward journalists? How could we layer billions of searches from all over the globe and add some visual flair, all while respecting journalism's obligation to inform clearly?

The data from Google Trends is anonymized, aggregated, and normalized, allowing reporters to find and compare salient points in world affairs. We compiled a list of topics related to climate change searches (like Drinking Water, Air Pollution, and Wildlife) and compared the quantitative data (volume of searches over time) across the major cities of the world to see how ‘important’ these topics were in each place. We also had really interesting qualitative data: the literal queries entered into the search field.

While the detailed data on small towns and the volume of searches in major cities were really interesting, these queries were the home run, so to speak. For journalists who care about the civic health of our cities and nations, they offer the potential to spot trending questions before major issues bubble to the surface without a proper platform to debate them.

Our final piece links two experiences: a multi-touch wall and four Chromebook Pixels.

The multi-touch wall shows queries popping up as the globe spins. Using real data, the piece simulates the activity of users constantly querying Google on our eight topics around global warming. The Chromebook Pixels let visitors dive into some of the more qualitative data and excerpts from recent publications on these topics in major cities and towns around the world.

As expected, we were prototyping and tweaking the visual and interaction design daily during the final two weeks leading up to the GEN Summit. The work paid off: the Summit went smoothly and the piece was well received. Thanks to our collaboration with Chang, another version of the multi-touch wall, released as a Chrome Experiment, was showcased a few days later in New York City.

See the project here.
You can also read the Washington Post's write-up about the project here.

Cheers To The New Year!

2014 was a big year for us.
“Us,” perhaps, being the keyword here.

Not only did we make the move from our Berkeley office to a new space in Oakland, but we also welcomed Samuel, Katarina, and Anna, expanding our team to six. Of course, we've had our share of growing pains, but over the last few months we've found there's no better way to hash these out than over a few brews in our new hood. And so, together, we explored Oakland with good food and great beers.

Our team had been throwing holiday card ideas back and forth for the last few months to celebrate (surviving) the year. We knew for certain that we wanted to share our favorite local spots with friends who have watched us evolve over the last several years, many of whom we have bonded with over food and drinks all over the world. In an effort to connect one step further, we took a look at where these friends call home…and where we'd drop in for a drink or a bite should we visit.

Of course, as we aren't familiar with some of these areas, we wanted to make our best guess and choose locations that would be worth the trip. Ultimately, we used FourSquare data to find the most popular places visited within 1 mile* of each addressee. From there, we manually curated the list down to what we thought were the top 8 places. To compare the venues, we then defined an index that weights each location by popularity (number of people who visited), rating, price tier, and distance from the home or studio.
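As a rough illustration of how such an index can work (the exact weights and formula we used aren't published here, so the field ranges and weights below are assumptions made purely for the example), here is a minimal Java sketch that scores a venue from the four signals described above:

// Hypothetical venue record: the ranges and weights are illustrative
// assumptions, not the values we actually used for the card.
class Venue {
  String name;
  int checkins;      // popularity: number of people who visited
  float rating;      // venue rating on a 0-10 scale
  int priceTier;     // 1 (cheap) to 4 (expensive)
  float distance;    // miles from the addressee's home or our studio

  Venue(String name, int checkins, float rating, int priceTier, float distance) {
    this.name = name;
    this.checkins = checkins;
    this.rating = rating;
    this.priceTier = priceTier;
    this.distance = distance;
  }

  // Normalize each signal to 0..1, then blend them: more visitors and higher
  // ratings push the score up; pricier tiers and longer trips pull it down.
  float index(int maxCheckins, float maxDistance) {
    float popularity    = checkins / (float) maxCheckins;
    float quality       = rating / 10.0f;
    float affordability = 1.0f - (priceTier - 1) / 3.0f;
    float proximity     = 1.0f - distance / maxDistance;
    return 0.4f * popularity + 0.3f * quality + 0.1f * affordability + 0.2f * proximity;
  }
}

A score like this is also what drives each circle's size in the final design described below.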

Our designs evolved through many iterations but remained fairly simple. We plotted the locations on a map and played around with using indicators, such as cross streets, for orientation. In the end, we let the locations appear suspended around the origin, with their varying sizes giving a hint of depth to the visualization.

The circle sizes, generated from each location's index value, naturally emulate holiday ornaments as they hover about the origin. The front of the card layers both Pitch Interactive's locations and those of the addressee, while the inside breaks out each visualization individually. The origin point remains in the same physical position on each panel, and the gradient fill from the front coincides with the location fills on the inner panels.

We wanted to challenge ourselves by producing the print piece in-house. We registered, folded, and scripted each address by hand. For a special touch that we thought would be appreciated, we licked the envelopes ourselves.

We've enjoyed the responses received over the holidays: thank you, and congratulations to all on the completion of another year. We hope 2015 will bring us together with many of you again. Should you find yourself in Oakland, say, around happy hour, you know where to find us.


*For some locations, we extended our radius up to three miles.

Synaptic Motion: Us and The Brain

The brain is a fascinating mass of connections, or synapses, where neurons constantly form constellations to help us make sense of our world. So when founder and choreographer Jodi Lomask of Capacitor, a performance art company that explores non-traditional combinations of arts and sciences through movement, approached us, we were intrigued by the thought of helping visualize what the brain's activity looks like in a live performance. Their latest project, Synaptic Motion, debuted at the Yerba Buena Center in San Francisco this September.

Synaptic Motion intertwined music, visuals, and dance to show the brain’s many complex processes.

The performance comprised custom-composed audio, a live MC, an array of costumes, dance, large-scale movement sculptures, and floor and wall projections.

Jodi had EEG data recorded from her brain at the UCSF Neuroscape Lab in order to see what her thought processes looked like and how the outputs changed as she moved from her most active states to sitting completely still. We analyzed the scans and began to ideate graphical metaphors representing the transitions of the brain and the different phases of thought. With EEG data in hand and the vision in mind, the new challenge was synchronizing the brain and sound wave data with movement.

Sound artist Oni Martin Dobrzanski composed a soundtrack with specific pieces for each scene of the performance, audibly representing how the brain responds to hopelessness, a caffeine rush, a seizure, or an idea. Each song had so many granular layers that we felt our visualizations would be most complementary as layers of simple shapes forming grids and dynamic compositions.

Using Processing, we loaded Dobrzanski's audio and synced our sketches to its frequencies. But as a data visualization studio new to the dance production process, we felt we needed to better understand what it means to work in the performing arts. So our team dropped by rehearsals to speak with the dancers, scientists, and media artists. We demoed initial sketches at a few of these rehearsals, felt good about the direction we were taking, and continued to iterate.
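For readers curious what syncing sketches to the frequencies can look like, here is a minimal sketch along those lines using Processing's bundled Minim library. The file name, band sampling, and shape mapping are placeholders for illustration, not our production code:

import ddf.minim.*;           // Minim ships with Processing
import ddf.minim.analysis.*;  // FFT lives in the analysis package

Minim minim;
AudioPlayer track;
FFT fft;

void setup() {
  size(800, 600);
  noStroke();
  fill(255);
  minim = new Minim(this);
  track = minim.loadFile("score.mp3", 1024);  // "score.mp3" is a placeholder file name
  fft = new FFT(track.bufferSize(), track.sampleRate());
  track.play();
}

void draw() {
  background(0);
  fft.forward(track.mix);  // analyze the buffer currently being played
  int cols = 16;
  for (int i = 0; i < cols; i++) {
    float amp = fft.getBand(i * 4);                 // sample every 4th frequency band
    float x = map(i, 0, cols - 1, 50, width - 50);  // lay the bands out in a row
    float d = map(amp, 0, 30, 2, 80);               // louder band, bigger circle
    ellipse(x, height / 2, d, d);
  }
}

In the actual pieces, the frequency data drove the layered grids of simple shapes described above rather than a single row of circles.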

For one of those iterations, we developed a ‘constellation’ algorithm that generated lines moving through a grid to mimic connections made between neurons. Whenever the lines passed through the composition's threshold, a new constellation was generated. This visualization ended up being one of the favorites despite its simplicity. To make it more dynamic and even more awesome, we made a 3D version.
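The constellation code itself isn't published in this post, but the idea can be sketched roughly: move a set of points along grid lanes, and whenever two of them pass within a threshold distance, draw a connecting line, like a synapse firing. Every parameter below (node count, speeds, threshold) is made up for the illustration:

int nodes = 24;
float threshold = 90;   // connection distance in pixels, chosen arbitrarily
float[] x = new float[nodes];
float[] y = new float[nodes];
float[] vx = new float[nodes];

void setup() {
  size(800, 600);
  stroke(255);
  fill(255);
  float laneHeight = height / float(nodes);
  for (int i = 0; i < nodes; i++) {
    x[i] = random(width);
    y[i] = laneHeight * i + laneHeight / 2;   // each node rides its own grid lane
    vx[i] = random(0.5, 2.5) * (random(1) < 0.5 ? -1 : 1);
  }
}

void draw() {
  background(0);
  for (int i = 0; i < nodes; i++) {
    x[i] += vx[i];
    if (x[i] < 0 || x[i] > width) vx[i] *= -1;  // bounce at the edges
    ellipse(x[i], y[i], 4, 4);
  }
  // connect any two nodes that are currently close enough: a new "constellation"
  for (int i = 0; i < nodes; i++) {
    for (int j = i + 1; j < nodes; j++) {
      if (dist(x[i], y[i], x[j], y[j]) < threshold) {
        line(x[i], y[i], x[j], y[j]);
      }
    }
  }
}

The 3D version mentioned above extended the same idea into three dimensions.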

We encoded, packaged, and sent our favorite animations to Mary Franck, the projection artist of Synaptic Motion and the mastermind of the show’s visuals. Mary pieced together our work, weaving her graphics and ours into the storyline and overall concept of the shifting states of the mind. It was exciting to build visualizations for a piece where we had no idea what the final output would look like. Only at Yerba Buena’s opening night did we finally see the production in its entirety.

Even as collaborators on this project, we could not have anticipated how powerful and immersive the experience would be for the audience. Throughout the show, the crowd was encouraged to roam and take in all perspectives while dancers emerged from the dark, weaving through the crowds and drawing our eyes to the center of it all: the dance floor. As viewers and contributors, we were captivated by the complex idea, completely engaged with the performance, and left feeling as if we had just exited a scene from a sci-fi film.

What we valued most about working with Capacitor and all the creatives behind Synaptic Motion is how, collectively, we took something difficult to imagine visually and made it tangible and stimulating.

Our Team on this project:

Wesley Grubbs: Creative Director
Anna Hodgson: Art Direction
Shujian Bu: Lead Engineer
Nick Yahnke: Engineer