
Is Google Glass dead? I don’t think so, and here’s why

Subtitle: A short essay in which Kelsey proves she is an extremophile of Sci-Fi literature

People love to make predictions of success and failure. Star Trek is commonly cited as one of the first TV shows to predict the flip phone, iPad, and smartphone. ‘Back to the Future II’ predicted the hoverboard, Arthur C. Clarke first conceived of GPS, and Ray Bradbury, in ‘Fahrenheit 451’, predicted earbuds, giant TVs, and mechanical hounds…and cats*. Compared to all that, digital glasses (a la William Gibson’s ‘Neuromancer’) seem like a slam dunk. Plus, Google is like the Pixar of the tech world. They’ve had so many wins, how could they fail? Yet, two years and thousands of selfies later, the predicted revolutionary impact of Google Glass now seems to be going the way of the Segway.


Restored 17th C. sketch of Raphus cucullatus by Dronte (Wikimedia)

However, as with the Segway, Google has not tapped into the true market for Glass: the primary (raw materials) and secondary (manufacturing) economic sectors. Stay with me here and I promise this leads back to paleontology. How handy would it be for an inventory screen to pop up in the upper right corner of a person’s vision? Or for a logging company to keep track of where and how they are cutting down trees? You could even keep a record of what each tree looked like beforehand. A surgeon has already used Glass to record a procedure, and in the future I can see EMTs sending reports and pictures of a patient to the hospital before they arrive, so the staff are better prepared.

No one would judge a person wearing Glass to take inventory or save a life. Instead of trying Glass out in the shower or at a wine bar, we as a society should focus on what technology can contribute to humanity, not how it can enhance a Facebook status.

“Really, Captain, I don’t feel silly at all wearing this…” (ST DS9)

Currently, the University of Texas at Austin’s Non-Vertebrate Paleontology Laboratory (NPL) has a team of volunteers using three Glasses to conduct a cursory inventory of their 3.5 MILLION fossils. Chase, one of the employees at NPL, calculated that it would take him 90 YEARS to catalog all the fossils currently at NPL the “traditional way.” The pictures produced by Glass are sharp enough that they are already being used for reference. It’s a damn good start.

If the 2015 Consumer Electronics Show (CES) is any indication, wearable tech is the way of the future, but a future that must be as useful as it is flashy. Gadgets can’t just be useful for the consumer market; they have to blend into the background of a “normal” life. In an industrial or scientific context, however, normal is shoved out the window in favor of innovative tech and, most importantly, gadgets that make people’s lives easier.

So, there you have it. Google Glass is not a Segway (which, by the way, has been adopted in large manufacturing facilities and by security companies), but a useful tool for the future.

And don’t forget: there are plenty of wrong futuristic predictions as well.

*And let’s not get into how accurate Aldous Huxley’s ‘Brave New World’ turned out to be. I’m just glad that fannypacks aren’t nearly as popular as he predicted.

Google Glass and paleontology collections: keeping a level head

Author: Kelsey

Today we continue on our Google Glass adventure! Check in here and here to see what we’re doing and why.

Confession time: At this point I know only a rudimentary amount of programming. This only becomes a problem when I get a brilliant idea (my Morse code app will exist… one day) or Google Glass doesn’t do something I want it to do. Well, short of taking a crash course in shoddy programming, I decided to create a “physical app” to address my predicament.

The quandary: To take a picture with Glass, you say “Ok, Glass, take a picture”. The computer then beeps happily to you as it takes a ‘screen grab’ of your life. Now, our version of Glass did not get the update that lets you aim the camera before a picture is taken (a common complaint among wandering Google Explorers). So, every time I took a picture, I found that I had tilted my head about 30 degrees to the right.


Either that or we have a serious problem with our drawers

I’m not sure if this is because all the weight of Glass is on the right, or because I naturally incline my head about 30 degrees, but I needed a way to keep my head level as I took pictures. When I was a kid, I used to play with my dad’s level, assessing the stability of my Barbie house and bunk bed (yep, I had a bunk bed/fort/spaceship… on reflection, I’ve significantly downgraded since then). So I decided to create a device to hold a small level in front of my left eye.

After a quick trip to JoAnne’s, I acquired a teeny level my kid-self would be jealous of. I raided the NPL supply closet and came away with tongue depressors, B-52 (an adhesive), and twisty ties. One hour later, it lived!


Behold! Our product shot

Notice how far out the level is compared to Glass. That’s because Glass uses refraction to make the image appear farther away than it actually is; the level uses good old-fashioned corneal focusing. That day I achieved two goals: I provided the entertainment for the day by strutting around, and I managed to take level pictures.


This represents a beautiful moment in my life

In the end, the device is more of a training tool than a permanent addition. Once I had the feel for what “level” was, I was able to remove the device and still take even shots with Glass. Then again, why would I want to take off such a classy addition?


NPL: they tolerate me so well

Special thanks to Angie and Cissy for the photoshoot.

Questions? Comments? Leave them below.

Google Glass and paleontology collections: what we learned, Part II

Author: Kelsey

Recap: One of my projects this summer was testing Google Glass for the Nonvertebrate Paleontology Laboratory (NPL) here at the University of Texas Jackson School of Geosciences. We are interested in testing curation potential. After many rather artistic drawer shots and some casual photograph comparisons, I decided to suit up and get systematic. If you’re just joining us, check out Part I here!

Goals: My primary interest is in data preservation, so I’ll be focusing on object and text resolution, including degree of pixelation, lighting conditions, and glare. I decided to compare Glass against another small, mobile device: the iPhone 5.


Setting: The drawers I decided to photograph house a set of preserved insects (sadly, no DNA here). The external dimensions of the bottom cabinet are 59.5 x 71.2 x 121 cm (w x l x h). The average drawer width and length is 54 x 66 cm. The only light sources are overhead fluorescent bulbs. The temperature was around 85 degrees F (29.5 C) with humidity around 60%.


Not shown: cabinet of extra undergrads

 

Parameters: I am testing the “hands off” potential for Glass, so I relied on voice commands and used the touchpad only when I had to. None of the pictures needed to be “shared,” just saved on the device, so no wi-fi or Bluetooth connection was necessary. I wore the Glass lanyard to prevent slippage (discussed in Part I). For comparison, I used the NPL’s iPhone 5. Both tests were timed, and any label I removed from a bag to photograph in one trial, I also removed in the other. Both the outside (“Out”) and the inside (“In”) were photographed.

Scoring System: The recording device with the greatest object resolution and text resolution was tallied for each drawer image. If there was no appreciable difference, or both could work just as well, both were tallied for that picture.
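To make the bookkeeping concrete, here is a minimal sketch (in Java, purely illustrative; the per-image judgments below are made up) of how that tally works: the better device gets a point for each photo, and a “no appreciable difference” call gives a point to both.

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Toy tally of which device "won" each drawer photo; a tie scores for both. */
public class ResolutionTally {
    public static void main(String[] args) {
        // Hypothetical per-image judgments, one entry per drawer photo.
        String[] judgments = {"IPHONE", "TIE", "GLASS", "IPHONE", "TIE"};

        Map<String, Integer> tally = new LinkedHashMap<>();
        tally.put("GLASS", 0);
        tally.put("IPHONE", 0);

        for (String winner : judgments) {
            if (winner.equals("TIE")) {
                // No appreciable difference: both devices get the point.
                tally.merge("GLASS", 1, Integer::sum);
                tally.merge("IPHONE", 1, Integer::sum);
            } else {
                tally.merge(winner, 1, Integer::sum);
            }
        }
        System.out.println(tally); // prints {GLASS=3, IPHONE=4} for the judgments above
    }
}
```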


Results:

Time: Glass, 15 minutes 32 seconds; iPhone 5, 21 minutes 15 seconds

[Results table]

Discussion

There you have it! Both the iPhone and Google Glass have good enough resolution to record object data (i.e., the fossils are recognizable). The iPhone outperforms Google Glass in text resolution, but Glass took only about 73% of the time (15:32 vs. 21:15). This is only a pilot study with one trial (I know, but n = 1 sounds like a better and better plan as the temperature rises in the cages!), but it is very important in determining our next step. Mainly, we need a higher-resolution camera in Glass. At this point, more information is lost to resolution, lighting, and glare than is made up for by the hands-free Glass experience…for now, anyway.

The pace of not only technology innovation but technology adoption is increasing. We can fear change and criticize the hiccups, or we can work to understand these emerging technologies and use them in novel ways no one ever thought possible. Personally, I love a bit of constructive criticism (it’s the only way I stopped being the “know-it-all” kid in high school), but too much negativity only highlights technology’s Orwellian uses. We expect Glass and similar devices will catch up with smartphones in no time. At that point we (NPL) plan to acquire a second Glass.

Our future projects include training volunteers, testing the screen projection capabilities, tagging images, linking images, app programming, and virtual field trips. Living on the bleeding edge definitely has its drawbacks, but this summer has been a fascinating experience and I can’t wait to see what is next.

Stay tuned for the next post in this series: how to deal with the dang drawer tilt.

Questions? Comments? Leave them below.

Google Glass and paleontology collections: what we learned, Part I


Author: Kelsey

Part I in the Google Glass series. Other posts: Part II.

The Lowdown: Google Glass has remarkable potential as a curation and documentation tool, but what it gains in efficiency it loses to limited picture resolution and a lack of updates. Before we acquire our next one, we will wait for a newer version with a better camera, but we are stoked by this new piece of technology.

Background: This summer the Nonvertebrate Paleontology Laboratory (NPL) acquired Google Glass, version 2 of the Explorer Edition. NPL is part of the University of Texas Jackson School of Geosciences, where I go to grad school and study Australian agamid lizards in all their cranial kinetic glory. Full disclosure: I was working at NPL and had suggested to Ann (Curator and Collections Manager) earlier in the year that we try out this new technology. The idea of augmented reality or forehead cameras is not a new one (sci-fi writers have been heralding their coming for over half a century), but here was a chance to test a tangible piece of the future.

We had one “simple” goal at the beginning of our adventure: test the camera and video for curation potential. Ten to twenty thousand fossils are added to NPL every year, and only about one-eighth of these are digitally recorded in our database. Inventory is a careful balance of speed and detail; both whole drawer contents and individual specimens are often recorded. Our fleet of staff and volunteers has begun using cameras, iPads, smartphones, and now Google Glass. In science (and, I suspect, academic institutions in general) simple goals often turn into reticulating fractals of fascinating sub-tests, sub-questions, and side studies. Fortunately, that’s why I got into this business.

Requirements: We are interested in devices that reliably and repeatedly capture images with a high enough resolution that all text in the field of view is readable and the fossils are individually recognizable. These photos would then be saved in our database for future research and inventory reviews.

Stats: The Google Glass Explorer Edition comes with a 5 MP fixed-focus CMOS camera capable of taking 2560 x 1888 resolution images. The fixed focus means the lens is set to capture as great a depth of field as possible and will not adjust, automatically or otherwise. Glass will tune the ISO (the sensor’s light sensitivity) from as low as 60 to at least 960. Videos are shot in 720p only. The aperture is about f/2.5, with a focal length of 2.7 mm.

Start: I found taking pictures with Glass is like switching from a go-kart to a normal car: you have to get used to no longer aiming for the middle of the road. When you are wearing Glass, the screen is above your right eye (NOT in front of it), and the camera lens is to the right of that, so you have to aim your head at the left side of the drawer while taking the picture. The camera app for our edition does not have an aiming feature, so getting the correct angle and resolution takes practice, patience, and intuition.

Observations: Hands-free is great! When I used the iPhone to take pictures (more about that in Part II), I had to constantly put down the phone to move drawers or reposition specimen labels. It was incredibly handy to have both hands free. Additionally, Glass really is comfortable to wear.

Ok Glass, point the laser at…

However, I found I was never truly hands-free. For every single picture you have to backtrack (the “swipe down” action) to the Glass home screen (above) and ask it to take a picture. Glass saves all pictures, but prompts you to share an image immediately after you capture it. This could be solved with a simple picture app that bypasses the social media features. Once I get my mad programming skills up to snuff, this is one of the first projects I’d like to tackle.
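For the curious, here is a rough sketch of what such an app could look like. It follows the Android/Glass GDK camera-intent pattern as I understand it, but the activity name is made up and the snippet is untested: fire the built-in image-capture intent, note where the photo landed, and finish without ever offering a share card.

```java
import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;
import android.provider.MediaStore;
import android.util.Log;

import com.google.android.glass.media.CameraManager;

/**
 * Hypothetical "inventory camera" for Glass: hand off to the built-in camera,
 * log where the image was saved, and close without showing a share card.
 */
public class InventoryCameraActivity extends Activity {

    private static final int TAKE_PICTURE_REQUEST = 1;
    private static final String TAG = "InventoryCamera";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Launch the stock image-capture flow as soon as the activity starts.
        startActivityForResult(new Intent(MediaStore.ACTION_IMAGE_CAPTURE), TAKE_PICTURE_REQUEST);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode == TAKE_PICTURE_REQUEST && resultCode == RESULT_OK) {
            // The GDK reports the path of the (possibly still-writing) full-size image.
            String picturePath = data.getStringExtra(CameraManager.EXTRA_PICTURE_FILE_PATH);
            Log.i(TAG, "Saved drawer photo to " + picturePath);
            // No sharing prompt: just finish and be ready for the next picture.
            finish();
        }
        super.onActivityResult(requestCode, resultCode, data);
    }
}
```

In practice you would still launch it by voice through a GDK voice trigger, but the photo would go straight to storage for a later sync into the collections database.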

Pictures: I found the large depth of field meant I often underestimated how much of the drawer was in view, or I’d overcompensate and get WAY too up close and personal with the fossils, which just resulted in fuzzy close-up shots. Most of these problems can be solved with practice and the addition of aiming software.


Aiming Troubles

The camera is very sensitive to light levels and has no internal regulatory mechanism. Even a slight adjustment in head angle can make the difference between a dim and an overexposed picture. Wearing a baseball cap or wide-brimmed hat does not help; it only squishes Glass down to uncomfortable angles.


This analysis? Just right.

I also had a problem with tilting my head to the right. I suspect most people do not hold their heads perfectly upright, which results in tilted pictures. More on how I dealt with this problem in an upcoming blog post, “Keeping Level-Headed.”

Looking down to photograph low drawers caused Glass to slip, so I added a lanyard in the back. This was easy on one side, but the battery on the right necessitated a duct-tape solution. The addition does not compromise comfort too much and really helps secure Glass to my head. Fashion may also be compromised to a certain degree.


Google Glass: Nerd Edition

Overheating was also an issue. Our collections spaces (“the cages”) are, for the most part, not climate controlled, and summertime Texas heat and humidity are high even early in the morning. However, even in the climate-controlled areas, continuous use causes Glass to flash a warning message after about 15 to 20 minutes. This is an oft-cited problem in the Glass community, a consequence of the hardware exceeding its optimal operating temperature.

Is the text readable? Sometimes. The smaller the text and the worse the lighting conditions, the more likely data is to be lost. Also, it’s impossible to check the pictures until they are loaded onto a larger screen. On the other hand, when I had to leave a project halfway through, I could simply review the most recent pictures or videos and quickly pick up where I left off.


 If I never see this label again…

So, is NPL a victim of the Gartner hype cycle, or are we pushing the boundaries of museum science? Once I had a handle on the initial pros and cons (hands-free use, trouble aiming, trouble with light balance, overheating, and resolution), we decided to conduct a more formal study comparing the iPhone 5 to Google Glass. See “What we learned: Part II” for results and my preliminary conclusions!

Questions? Comments? Leave your reflections below.