
Offline mobile web app with SammyJS

Last week I went to the Great British Beer Festival (gbbf.org.uk) and, to help me track the beers I wanted from the 900 or so draught and bottled beers and ciders available, I built a little web app.

The GBBF beer selector doesn’t really work on mobile…

Great British Beer Festival do provide a beer selector on their website, but, like most organisations who make a web-thing for marketing, they’ve completely missed the point: people want to peruse the list while they’re *at* the beer festival, and their site is rather unfriendly to use on smartphones. What is needed is a mobile web app.

Last year a few of us had a go at this. My approach was to build a jQuery bookmarklet on top of their beer list, to allow the user to mark beers as “wanted”, “drunk” and “unavailable”. The advantage of the bookmarklet approach was that I didn’t have to make a copy of their data (and all the IP issues that incurs), and I was just augmenting what was already there (so hopefully would be less work). The disadvantage was that it required me to be online to access their website, and the jQuery to filter through their rather div-heavy DOM was slow on my Nexus One!

This year my aim was to build a small, simple web app in as few files as possible, so that I could download it onto my phone and use it offline (i.e. in airplane mode). This would involve getting the data to store in the web app (I have a lot of screen-scraping PHP code already) and then building a single page web app to let the user browse the beers and mark them as “wanted”, “drunk”, “unavailable” and make notes.

Getting the data

Unfortunately, the Great British Beer Festival are not an open data company (other than accidentally leaving an Excel of the real ales on their site). This means I had to screen-scrape the website. I’ve done a lot of this in some other projects, so just adapted the PHP code I already had. This involved reading every row of the table on the GBBF site into an associative array, then spitting it back out as JSON ready to be loaded into my JavaScript web app.

This took 3 or so hours the night before I was meant to be going to the beer festival. I was running out of time, but I intentionally wanted to restrict how long I could spend on this 🙂 I went to bed and decided to do the front end of the web app in the morning.

Single Page App framework

Despite never having built an offline web app, I knew I’d need a framework to handle the displaying of different “pages”, because there would in fact be only one web page. There are a bunch out there, but I wanted something that I could quickly pick up and build something with in about 3 hours: that significantly reduces the field! One thing that annoys me about software frameworks is how ridiculously complicated they are to start with. You typically have to read several days’ worth of tutorials, which explain in excruciating detail all the quirks and inconsistencies of that framework (as if you’re going to remember any of it)!

SammyJS Hello, World: Basically all you need for a simple client-side web app!

With PHP I spent a while trying Zend, CodeIgniter and CakePHP, all of which suffer from this flaw, before stumbling upon FatFreeFramework (F3), which is now my PHP framework of choice. I’ve recently been learning Django, which is a little bit heavy, but I’ve not investigated simpler frameworks yet (though I hear that it’s probably Flask or Bottle). Now was the time to find one for JavaScript. My initial searches brought up things like AngularJS, EmberJS and BackboneJS, but as always the ones people shout about most are the ones they spend 40 hours a week working with in their job, so they know all the ins and outs. Eventually I found SammyJS, which had a Hello, World of only 9 lines of code on its homepage, which instantly showed that it did everything I needed: handle a GET request, load data from a JSON file, and render all items with a template. Perfect!
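From memory, that pattern looks something like the sketch below. This is my reconstruction, not the code from the SammyJS homepage: the `beers.json` file and `beer.template` names are mine, and the method names may differ slightly from the real API.

```javascript
// A sketch of the SammyJS pattern: route -> load JSON -> render template.
// File names and template names here are illustrative, not from the real app.
var app = $.sammy('#main', function () {
  this.use('Template'); // enable .template rendering

  // Handle a GET request for the #/beers route
  this.get('#/beers', function (context) {
    // Load data from a JSON file...
    this.load('beers.json').then(function (beers) {
      // ...and render each item with a template.
      $.each(beers, function (i, beer) {
        context.render('beer.template', { beer: beer })
               .appendTo(context.$element());
      });
    });
  });
});

$(function () { app.run('#/beers'); });
```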

I started with the first part of the SammyJS tutorial, which walks through loading from JSON, displaying items using templates, and then displaying other pages. I implemented what I was reading, but using my data rather than their example data. This quickly got me a base for what I needed: listing all the beers that are available.

I then added the ability to search by style. This was just the code for displaying all the beers, with an if statement to only display ones of a chosen style, the style being defined by a URL route parameter. After repeating this for country, I realised that the code was the same for bar, brewery, country and style, so I created a JSON file for each of these and generalised the code to load the JSON file for whichever category is in the URL.
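The generalised filtering boils down to something like this sketch (the field names and sample data are mine, purely illustrative; the real app loads the scraped JSON files instead):

```javascript
// Generalised filtering: one function covers bar, brewery, country and style.
// The field names and sample beers here are illustrative, not from the real app.
function filterBeers(beers, category, value) {
  return beers.filter(function (beer) {
    return beer[category] === value;
  });
}

var beers = [
  { name: 'Oakham Citra',       style: 'Golden Ale', country: 'England' },
  { name: 'Thornbridge Jaipur', style: 'IPA',        country: 'England' },
  { name: 'Westmalle Dubbel',   style: 'Dubbel',     country: 'Belgium' }
];

filterBeers(beers, 'style', 'IPA').length;       // → 1
filterBeers(beers, 'country', 'England').length; // → 2
```

In a Sammy route, the category and value would come from URL parameters, e.g. a route shaped like `#/list/:category/:value`.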

I wanted to make it look not-too-terrible, so went to put in jQuery Mobile to make it really look like a mobile app. However, when I added it in, it started constantly reloading the root of the web server! It turns out that jQuery Mobile is more than just a set of widgets, it is also a framework for single-page apps (if only I’d known)! Anyway, that broke everything, so I immediately got rid of it and dropped in Bootstrap, so I could give the lists of links and buttons a more user-friendly look and feel.

Finally, I had to add the functionality for letting the user select which beers they want to try, have drunk and any notes they want to make. I had written most of this last year, using jQuery and HTML5 localStorage, so it should have been a pretty easy job to port it over. However, SammyJS doesn’t seem to let you use regular jQuery, which is a bit of a pain. I ended up using regular onclick and onblur attributes in the HTML to call functions I defined in my JavaScript. This got the job done, but it was a bit disappointing not to be able to attach the events programmatically.

While I’m on the subject, HTML5 localStorage is really lovely. It is so much nicer to use than cookies, and just involves reading and writing values in an object called localStorage that acts like an associative array, and that gets magically saved by the browser between visits to the page. Simple!
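The pattern is simple enough to sketch; markBeer and getBeer are my own illustrative names, not necessarily what the real app uses, and the in-memory fallback is only there so the snippet also runs outside a browser:

```javascript
// Sketch of the localStorage pattern described above. markBeer/getBeer are
// illustrative names; the in-memory fallback just lets this run outside a browser.
var storage = (typeof localStorage !== 'undefined') ? localStorage : (function () {
  var data = {};
  return {
    getItem: function (k) { return (k in data) ? data[k] : null; },
    setItem: function (k, v) { data[k] = String(v); }
  };
})();

// Save a status ("wanted", "drunk", "unavailable") and notes for one beer.
function markBeer(name, status, notes) {
  storage.setItem('beer:' + name, JSON.stringify({ status: status, notes: notes }));
}

// Read it back; with real localStorage this survives between visits to the page.
function getBeer(name) {
  var raw = storage.getItem('beer:' + name);
  return raw ? JSON.parse(raw) : null;
}

markBeer('Oakham Citra', 'wanted', 'try early, always runs out');
getBeer('Oakham Citra').status; // → 'wanted'
```

One thing worth remembering: localStorage only stores strings, hence the JSON.stringify/JSON.parse round trip for anything structured.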

I ran into another peculiarity of SammyJS when trying to display wanted beers as a separate list from unwanted beers, on the same page. I tried duplicating the loop, just with a different if statement in each. However, this still displayed some of the unwanted beers interleaved with the wanted beers. I think this is because some of the actions in SammyJS are based on asynchronous tasks (like loading a JSON file), so you’re meant to chain them using .next(). However, some of the SammyJS functions don’t seem to return an object that has .next(), so I was unable to chain in the way I wanted. A related problem meant that I found it difficult to insert text into the page before loading the JSON file. I’m sure I can figure out how to fix both these problems; I just didn’t have the time to do so during the development of this app!

The last thing I did before running out the door to get the train to London was to put it online and test it briefly on my smartphone. It looked like it worked, and the only thing I needed to tweak was the CSS to increase the size of the text a little and set the viewport width.

<meta name="viewport" content="width=device-width, initial-scale=1">

Webserver on my phone

A web server on your Android phone!

On the train, I tested whether I could save the app and use it offline. My very quick search indicated that in Android you could save a bookmark to the home screen for offline use. This did not work. I’d hoped that the (almost) single file aspect would mean Chrome would easily cache it and give me access to it, but as soon as I turned on airplane mode, Chrome immediately refused to even load any page (because it’s offline)!

I tried downloading all the files from the web server (luckily I had left a zip of the directory on the server) and using the file:/// protocol, but this seemed to stop any of the JavaScript working.

My final chance was to download a web server to run on my phone and host the files locally. There is a web server called kWS on the Play Store, which is free and did exactly what I needed! It’s also one of the few apps I’ve ever installed that wanted permission to access only 2 things (files and network, obviously for a web server). Once set up, with the files in /sdcard/htdocs/, I could access my app at localhost:8080. Brilliant!

Offline App Cache

The palaver with having to host a web server on my phone got me thinking: given a settings file at a certain location relative to the web app (say app.offline in the same directory as the home page), I could build an app that downloads all the appropriate files, and then hosts them locally on a web server on the phone. This seemed a little overkill, as someone must have thought of this before, so I hunted around a little.

And of course this is already a solved problem! I first found an Adobe blog about taking web apps offline, and once I knew the name of the technique (“app cache manifest”) I managed to find an HTML5rocks application cache tutorial which explained a few of the nuances and an offline HTML5 tutorial that explained some of the tricks to debugging. It’s dead easy: create a manifest file which lists the files that are required offline (in my case index.html, app.js and a few .json data files) and reference it in the <html> tag of your pages. The only downside is that you do have to manually list every file you want offline (wildcards don’t really work; they can if you have directory listing switched on, but it didn’t work for me), but you could write a server side script to generate this if you have lots of files.
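For this app, the manifest would look something like the sketch below (the .json filenames are my guesses based on the categories mentioned earlier; the version comment is a common trick, because changing any byte of the manifest is what triggers browsers to re-download everything):

```
CACHE MANIFEST
# v1 - change this comment to force browsers to re-fetch everything

index.html
app.js
bar.json
brewery.json
country.json
style.json
```

and it is hooked up in the page like so:

```
<html manifest="cache.manifest">
```

One gotcha: the manifest has to be served with the text/cache-manifest MIME type, or browsers will silently ignore it.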

It’s quite useful to know that in Chrome you can go to¬†chrome://appcache-internals/ to see whether your app has been cached, and to clear the cache after you’ve updated your app. The console in Chrome Inspector is also helpful, as it tells you each of the files it is caching, and if it fails at any of them, gives you an error message.

Summary

There we go. In less than a day’s work I managed to learn a client-side app framework, build an app to meet my needs and figure out how to get it to work offline. Hopefully, this will be useful practice for future app development! Of course, some of this has built on previous skills I have learned (PHP web site scraping, jQuery and localStorage), but I’m glad I managed to find the tools to use quite easily, wrangle them into what I wanted and have the chance to share those tools here!

SammyJS is pretty lovely. Unfortunately it seems to become a bit of a pain to do things that don’t fit exactly in its view of the world, but it does do those basics elegantly enough that I’d like to use it again. localStorage is brilliant for storing simple information between uses of the app. The application cache is such an easy way to save your app offline and keep using it when not connected to a network. The bits for building offline web apps are all there, and they’re really easy to pick up. My advice is to start with these, and learn something more complex when you need it.

My code is at https://github.com/rprince/gbbf2014-webapp in case you want to check it out (note you’ll need to get SammyJS as I’ve not figured out the licensing yet). You can check out a working version of it here: http://users.ecs.soton.ac.uk/rfp07r/gbbf/2014/.


Thunder Run 2014

I have once again returned from Thunder Run. My blog post about Thunder Run 2013 was quite well received, and served as a reference when planning for this year’s for at least a couple of my friends, and myself. I thought I should do the same again, but I’ll try not to repeat too much of what I said last year!

The event was, like last year, excellent. The overall setup was pretty much identical. The course was almost identical, give or take a slight adjustment at about 9k, to take you along the east of that bit of the campsite (rather than the west, which was a bit too boggy last year). The vendors by the start/finish were largely the same (Buff, Adidas, ice cream van, fish and chips, cycling accessories), with the addition of an “Alpine” food stall selling tartiflette, hot chocolate and mulled wine(!!). The staff in the main food tent seemed a little green to start with, but were fairly well drilled by the end of 24 hours of serving hungry runners! I also heard that they ran out of jacket potatoes and pasta with meatballs in the middle of the night, but this didn’t seem to be an ongoing complaint, so I assume they rectified this.

The nutrition stand were there again, though with Osmo hydration and Honey Stinger energy bars (rather than Clif bars). They still did the unlimited refills of hydration powder, which was excellent, and there was even a choice of bottle this year 🙂

parkrun

http://connect.garmin.com/activity/551377976

LRR Thunder Runners at Conkers parkrun (thanks to @runbeckrun for the photo)

The nearest parkrun to Thunder Run is Conkers parkrun. As parkrunners tend to like a bit of tourism (visiting other parkruns around the world), it’s become a bit of a tradition for parkrunners who are in the area for Thunder Run to visit Conkers on the Saturday morning. I went last year, but for some reason didn’t write about it in the blog post.

This year there was a much larger crowd of us (2 packed cars) which made for a very fun and social parkrun (we stopped for a lot of photos, especially with the sign-holding volunteers at Conkers). It’s a lovely park and well worth checking out if you’re in the area on a Saturday morning.

Lap 1

http://connect.garmin.com/activity/551377994

At 2pm, it was hot! I tried not to go too hard, as I knew I had lots of running left ahead of me, but probably went up the first hill a bit too fast, then definitely went down the first descent too fast!

Stopped for a drink at the water station, then continued running. Second half of the lap was probably a bit cooler (more tree cover provided more shade), but I was still slower, due to the drink stop and having gone out a bit hard in the first half.

Embarrassingly, I fell over on a completely flat piece of ground, just past the 4 mile mark. Now, it was through one of the forests, so I guess there might have been a tree root or something, but it didn’t feel like I’d tripped on anything that solid. I think I just got lazy with my feet, didn’t lift them high enough and clipped the front edge of a dip in the mud. One thing’s for certain: for the rest of that lap my feet were exaggeratedly high! The kind runner ahead of me stopped and called back to check I was ok; as I rolled onto my back and sighed, I yelled “I’m ok thanks” and realised I should get up and running. I overtook the friendly runner not long after and thanked him again: I had a little scuff on my left knee (I normally do, I’m a goalkeeper) and was covered in dusty mud, but other than that I was fine.

Lap 2

http://connect.garmin.com/activity/551378017

By 8pm, the weather had cooled a little, with a little more cloud cover. The organisers announced that anyone going out after 8.20pm had to wear a head torch. I took mine with me, but it was still light enough to easily recognise the course.

An absolutely beautiful shot of the Thunder Run campsite. (Thanks to James Saunders for the photo!)

This was probably my most comfortable lap of the weekend: not too hot, good light, I was beginning to anticipate which bit of the route was coming next and I wasn’t too tired yet. I kinda knew this was my best opportunity for a fast lap, as I was less tired than I would be on Lap 4! I pushed a little harder than Lap 1, and did end up going faster by 45 seconds, though it felt harder than that. I’ll put this down to lack of endurance training (or any training)!

It started raining lightly towards the end of the lap, which was nice and cooling, but also meant my clothes that were drying on the gazebo got wet again!

Lap 3

http://connect.garmin.com/activity/551378037

My third lap was my only one in total darkness. Despite only about 4 hours sleep, I woke up pretty easily and prepared for the lap. It definitely took half a kilometre or so to get going. I’d done some dynamic stretching, but should probably have gone for a little jog. Head torch was good, didn’t have to adjust much. It was still quite warm (I wore a t-shirt rather than a vest). Uneventful lap, really.

Lap 4

http://connect.garmin.com/activity/551378064

Feeling fatigued by now. Chest felt tight (in a muscular, rather than breathing way) for the first half a mile, presumably because my arms and shoulders were tired.

Took the first few hills relatively easy, though was still overtaking.

Was exhausted by 5km, so decided to walk to the water station and then up Conti hill. From there I ran the rest, but not quick. It was not as hot as Lap 1, but I was just knackered!

It was quite a nice feeling to finish this lap, have a shower and get some food, knowing I was done for the day. I was pleased to have completed one more lap than last year.

Shoes

Last year, my major mistake was not taking a suitable pair of trail shoes. I assumed the weather would stay good (because it was summer), but the torrential storm created a running surface that my Vibram FiveFingers Trek Sports just couldn’t handle.

I did a bunch of research into minimalist trail shoes and went to Thunder Run 2014 with much more appropriate footwear.

Fellow Brontophobic and blogger, Tamsyn. (Thanks to James Saunders for the photo!)

Firstly, I was given a pair of Inov-8 Bare-Grip 200 for Christmas. These have some proper lugs on them, almost the size of football boot studs, and more of them! I ran at a couple of very cross country parkruns on New Year’s Day (Lloyd Park and Roundshaw Downs). I am very confident I would have survived last year’s Thunder Run with these. Unfortunately I didn’t get an opportunity to wear them this year, as it barely rained!

More recently, I bought a pair of Vibram FiveFingers Spyridons from Sport Pursuit. My mistake was that I did not make it to any CC6s or RR10s this year (mostly due to injury), so did not wear them in before Thunder Run. I ended up wearing the Spyridons for every lap of Thunder Run this year (I wore my Trek Sports for parkrun), so I ended up with a blood blister on my second toe, despite covering my toes in Vaseline. The Spyridons are a lovely minimalist trainer. While they felt a tad stiffer and heavier than my other FiveFingers, they had a bit more protection on the sole, so tree roots, stones and even bits of half-buried brick were no problem at all. Grip wise, I was happy enough as I had no problems with sliding sideways or down hills, though the ground was 95% dry for all of my laps. I think they would have coped with a bit more rain, but I have no evidence of whether they would have survived last year!

Nutrition

I followed a fairly similar approach to last year: cook lots of pasta on Friday night, and save the rest for mini-meals after each lap. I also had cereal bars, malt loaf, peanut butter, bananas and jelly babies as energy snacks in preparation for a lap. I never felt short of energy, though I think that’s more to do with lack of stamina meaning I never dared run too fast.

Also, as I was feeling a bit more comfortable with Thunder Run as a whole, I ate some of the catered food before and during the event (last year I wouldn’t have dared, as it was untested). A late night tartiflette, following my second lap, did a very good job of warming me up and filling me up 🙂

Fitness and Performance

I didn’t train for this Thunder Run. In fact, I’ve barely been training at all. I still don’t feel confident that I’m over my injury. I seem to have got past the glute medius problem, though that might just not be being aggravated because I’m not running very much. However, the front of my left hip still gets tight when running, which I understand is linked to having very tight adductors. I try to stretch every day to improve this, but it’s slow and uneven progress.

Lordshill Road Runner soloists Jim and Rob.
(Thanks to James Saunders for the photo!)

My training recently has consisted of 1 x Run Camp, 1 x parkrun and occasionally 1 x another run each week. No long runs, no track sessions, no hill work. I want to get back into my training, but I’m worried about getting re-injured. Also, now I’m out of the habit of going to certain training sessions, it’s hard to find where to fit them back in (even though I don’t seem to be doing anything particularly important with that time).

Nevertheless, I’m still happy with my performance at Thunder Run this year. All of my laps were under 60 minutes, and I did 4 of them. That was effectively my goal last year, which I failed at massively due to lack of preparation for the conditions! Now I’ve had a simple, straightforward, successful Thunder Run I can start planning ahead to a more daring goal next year. 5 laps? 4 laps under 50 minutes each? Who knows? I’ll see how training goes!

Reflection on tips from last year

My Thunder Run blog post last year had some tips for myself this year. Stupidly, I didn’t listen to all of my own advice!

I didn’t take 4 of everything to cope with all the laps. I had enough technical t-shirts and vests to have a new one every lap, but I only have 3 pairs of running shorts (so took them all), 3 appropriate pairs of shoes, 2 base layers and 1 towel. To be fair, the multiple shoes is only really required if it’s raining a lot, but in the heat my base layers and shorts got very sweaty, which meant trying to rinse and dry them between laps.

I didn’t take a clothes horse (too unwieldy to carry) and forgot to take coat hangers (I don’t have many spare). I hung things from the gazebo, but got caught out when it started raining while I was out for my 2nd lap and everything got wet again (it was almost dry when I headed out).

I didn’t take enough cash. There were many more Buffs that I wanted to buy, and they didn’t even bother bringing a card machine this year! I had enough for any food I wanted to buy, but I still felt like I was having to be careful about what I spent, which is a distraction from running!

I also failed to acquire a camping chair, but there was nearly always someone not sitting at any one time, so there was often a spare. This is a bad thing to rely on though, so I’ll sort this out next year.

Tips for next year

These are additional tips for myself, as I failed to heed my own advice from last year and I wish I had done the following:

  • Take coat hangers
  • Take more cash
  • Get a camping chair
  • Get more base layers
  • Get more shorts
  • Get another towel
  • Go for a short jog before each lap.

Of course, last year’s tips still stand true:

  • Take 1 of everything that you need for each lap you hope to do. Ideally this would include:
    • base layer
    • vest/shirt
    • shorts
    • shoes
    • socks
    • towel
  • Figure out a way of getting things dry, even when it’s raining. Maybe a clothes horse or a few coat hangers if your tent is tall enough.
  • Have good trail shoes. Try them out in the CC6 races to make sure they’ll be adequate.
  • Cook up your night-before-race meal (e.g. pasta) and have enough left over to snack on after each lap. Other pre-race snacks (e.g. energy bars, cereal bars, jelly babies, peanut butter sandwiches) are handy as well.
  • Take some cash for food and gear at the site. Some of the vendors have card machines, but mobile phone signal was appalling so they didn’t work!
  • Camping chairs and gazebos are very useful.
  • A tent you can stand up in is great.

And some I didn’t think to add last year, but might be useful to someone who has never been before, or may not camp very often:

  • Some way of transferring water in bulk from the water tanker to your campsite. A couple of the 5L bottles from the supermarket work nicely.
  • Get a good head torch, charge it well before going.
  • Have another torch for moving around the tent and campsite at night (so you don’t run out the batteries in your head torch).
  • An inflatable mattress is helpful.
  • A fairly light sleeping bag will do. It’s summer, so is still quite warm at night. Having a jumper, hoodie or towel close by can come in handy as extra warmth if it happens to get unexpectedly cooler while you’re sleeping.
  • Take your parkrun barcode #dfyb!

Next year I’ll rewrite this as a checklist.

Overall

Great Thunder Run. Thanks to all my Brontophobics team mates, and to my fellow Lordshill Road Runners. It was an excellent weekend away, and I can’t wait till next year! I’d recommend it to anyone who enjoys the social side of running, though I think it’s best if you can go with a team who have some experienced Thunder Runners to help you settle in.

WAISfest 2014

The last weekend of July 2014 was dominated by WAISfest, the research festival held by my research group, WAIS. This year I was heavily involved in organising the event, after we managed to convince the academics to relinquish organisational control to the people of the lab, who make up more than three-quarters of the participants of WAISfest. I decided to blog about the event as a way of capturing its essence for future members of WAIS.

Day 1

Kick Off

At 9.30am on Thursday, fuelled by pastries and coffee, 55 or so members of WAIS arrived in a lecture theatre to kick off WAISfest. Our new head of group, Luc Moreau, gave an introduction to the idea of WAISfest and thanked the organisers and everyone for taking part. Then I sped through a reminder of the schedule, pointing out when I expected people to meet back up, when the social events were, and when the opportunities for free food were 🙂 There were a record 11 themes this year, more than any previous year. Their titles were:

Charlie pitching “Implicitly-crowdsourced sensing”

Checking the pulse (Thur 2pm)

After I returned from lunch, I spent about a quarter of an hour buzzing round all the groups I could find to see how they were getting on. Unfortunately a lot of them had disappeared for a late lunch.

  • Visualising Impact – I had a chat with Will about the impact tracker his company has built, and whether it could be used to track WAISfest. It was a little bit late to be asking the question, now the event had already started, but he did give me access to the system, so I intend to test whether I can capture interesting information in it, with scope to get everyone to use it next time.
  • Coffee Room – there were remnants of 2 groups in the coffee room. The crowdsourcing open data group had mostly headed off to manually gather some data, but I did find their plans on the whiteboard. The other group were printing receipts for transactions where you exchange data for a product. At the time, they were having some issues because their thermal printer hadn’t been delivered yet and they couldn’t source one locally, but they were coming up with some novel solutions, such as printing to a web-connected thermal printer and watching on the webcam!
  • Linking accessibility data – seemed engrossed, so I didn’t bother them too much. They seemed to have a lot of data to be working with, so were busy!
  • Location-aware narrative – this group seem to do well every WAISfest, and everyone looked busy doing something. I believe they’re heading out to build a narrative of the Southampton City Walls tomorrow, which should be exciting: field trip!
  • Model Internet – working hard as well. They had calculated that actually modelling packets moving around even a small corner of the Internet would be way too intensive, so have rescoped to simulate traffic levels at various parts of the Internet, which should make the task more manageable.
  • Recognising sites from afar – my group, so I know how we were going! We had gathered screenshots of the 50 most visited sites in the UK, and were discussing the effect of shrinking them and a protocol for an in-person experiment (can people recognise the sites on a screen from different distances?).

I didn’t really get a chance to check up on the groups again because I was working on my theme, but I’ll do another summary after the Friday lunchtime status update.

Boardgames (social)

Thursday evening consisted of an opportunity to play some boardgames, an event organised by Jonny. There were about a dozen people there, so we split into 2 groups. I think the other group played Once Upon a Time and War on Terror. Charlie, Matt, Jonny and I played Robinson Crusoe, which was interesting, though difficult. It’s a collaborative game (I like these a lot), but gruellingly hard. The premise is that all the players have been shipwrecked on an island, and you have to take various actions to survive (by getting food and shelter). We played one of the 6 scenarios (purportedly the easiest!), which required us to collect enough wood to build a big fire, and survive long enough for a ship to sail past (which was actually fixed at round 12). We lost because one of our players “died” in round 7, because the weather had turned inclement and we didn’t have enough food and shelter to help us through. It felt as if the game was balanced a little on the difficult side, but it may also be that the amount of randomness (there is a lot of dice-rolling, as well as a lot of shuffled decks of cards to draw from) means that it’s only possible to win if you get perfect dice-rolls. Nevertheless, it was intriguing enough to make me want to play it again and get closer to winning 🙂

Day 2

Quite a quiet day. I was busy running an experiment, so didn’t get a chance to float around the groups too much. We did have a lunch and status update session, which seemed to go quite well. We videoed it:

BBQ

Sunday of WAISfest weekend saw the annual WAIS barbecue! Stalwarts and contemporary members of the lab, along with their families, joined together to eat, drink and play. This year our venue was the BBQ pit between South Hill and Hartley Grove halls, which has a huge open, grassy area providing us plenty of room for frisbee and football. With the addition of Chris’ gazebo and a couple of picnic blankets and camping chairs we had the perfect environment for a WAIS BBQ!

Charlie, Jon and Phil at the BBQ

Disaster almost struck, when it turned out that the BBQ pit did not have a shelf for charcoal. Luckily, before the ravenous hordes started tucking into raw meat, Chris produced a portable BBQ and Mike sped off to B&Q to pick up 12 disposable barbecues for £24 (bargain)! Before too long, the frisbee pitch was filled with smoke and the smell of chargrilled food 🙂 While the BBQ situation was being resolved, Yvonne, Pla and Priyanka handmade a fabulous selection of salads to accompany the meat. These were all amazing, so we must heartily thank these three for the delicious salads!

Day 3

The final day of WAISfest had a similar feel to the second day, with maybe a soupçon of panic as the 4pm wrap-up deadline approached. Just after lunch, I managed to do a short visit to all the groups I could find, to remind them about the wrap up. Everyone seemed reasonably calm, though many reported there were things they didn’t have time to finish. The wrap up was run fairly ruthlessly, with 5 minutes per group (with warnings at 2 minutes to go and 30 seconds to go). Each group reported back their successes and findings, as well as their experiences of attempting their task and how they might continue the work in the future.

COMING SOON: In another blog post, I will post videos of the wrap ups with a little summary of each.

[Photos: whiteboard walls; Internet graph]

Immediately after the wrap up finished, everyone headed back over to the Building 32 Coffee Room for pizza and an opportunity to discuss each other’s results (as there was no time for questions between presentations)! Lots of talking went on here, and everyone seemed in good spirits, so I think overall WAISfest did a good job of letting our researchers work on something a little different and engage in conversation with some new people. In my mind at least, that’s a success!

My response to “What Happened to Video Game Piracy?”

I read this article called “What Happened to Video Game Piracy?” on Communications of the ACM, and was a little disappointed that it seemed to miss a couple of key points, so I have addressed them here.

For a start, due to technical limitations, music was first down the well, and the games industry learned from that. In 1999, when Napster launched, home internet speeds made it barely feasible to pirate music, let alone a video game. This meant that by the time home broadband was viable, the games industry could learn from the mistakes of the music industry.

The younger games industry also seemed more receptive to new business models. id had already demonstrated in the 90s that shareware still worked with DOOM, following on from that distribution model being popular in home PC gaming in the 80s. The free and open source movement, the sharing of game code through magazines in the 80s and the culture of demos all embedded in the industry the mindset that money could still be made while giving something away for free.

The music industry couldn’t grasp this and resisted too hard. Although iTunes was released before Steam, it was under the restrictive control the music companies demanded. It was not the product consumers wanted, at a price the market considered fair (non-physical MP3 albums costing more than the CD!!). Steam did the opposite: it offered gamers what they wanted, at a competitive price, and the service continually improved (Steam sales, pre-order downloads, achievements).

This is not to say that piracy does not exist in the games industry. The market is split: consoles have little piracy, because modchips were made illegal and the industry clamped down on them; the PC market still has elements of piracy, and this is unfortunately affecting companies that only release on PC. If the music industry had a continually improving technology, then perhaps most people would listen to music on their equivalent of a games console, and piracy could be controlled better. But it doesn’t: music playback is a static technology. It has not changed significantly since the MP3 became popular over 15 years ago, and it is unclear whether the music industry (rather than the tech industry on its behalf) is investing anything to change this.

In effect, the answer isn’t solely that there are consoles, a different market and some legislation. The industries themselves have taken different approaches to handling the problem. The games industry embraced downloading as a distribution method, rather than fearing it, and gave gamers what they wanted sooner than the music industry did.

The Cathedral and the Bazaar

The Cathedral & the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary by Eric S. Raymond
My rating: 5 of 5 stars

The Cathedral & the Bazaar is absolutely fundamental reading for any computer scientist who wishes to have an anywhere near reasonable discussion about the state of software development today. It should really also be required reading in management, entrepreneurship and politics, as it outlines some interesting human motivations that, if embraced, could do great good for the world. The open source model of software development should not be feared or abused (as the immediate human responses appear to default to), but understood, respected and aspired to in all areas.

This paper copy of the book is a collection of essays by Eric S. Raymond, which fit together rather fluently. Each essay is available for free on Raymond’s website, but I think reading any in isolation would make far less sense than the whole, which this book presents. “A Brief History of Hackerdom” nicely sets the scene, “The Cathedral and the Bazaar” presents the central thesis explaining why the open source model produces better software than the closed source model, and “Homesteading the Noosphere” explains the human motivations behind the entire culture (relating it back to John Locke’s philosophy on property). “The Magic Cauldron” discusses the novel business models that are now possible due to open source software and why they are more sustainable than those relied on by closed source software, while “Revenge of the Hackers” explains how Raymond went from regular open source contributor to spokesperson for the community. These only make sense in the context of each other, and each one piles on more convincing evidence to support the model. You must read them all, because they are worth more than the sum of their parts.

Being around real hackers, open source developers, computer science academics, or even reading the tech community news gets you part of the way to understanding the culture and ethics behind open source. However, Raymond has thought so hard about this, lived through it and communicated it so succinctly that it is a crime not to read this so you can fully understand why an innovation in a relatively small technology community is going to have long lasting and important effects on the whole of society.

Raymond’s ego rises to the surface on occasion, but forgive him that and look past it into the principles of these essays. If you have any interest in software development at all, you must read this. I am frankly embarrassed that I haven’t read it before now, so don’t make the same mistake as me!

View all my reviews

The Invisible Man

The Invisible Man by H.G. Wells

My rating: 4 of 5 stars

The Invisible Man is the story Frankenstein would have been if Mary Shelley had been a more experienced author. HG Wells masterfully leads us through a mysterious build up, an action-packed middle and a treacherous finale, all in less than 200 pages. Much like Frankenstein, it is a story which I have seen adaptations of throughout popular culture, but only just got around to reading the original. It is refreshing to read the source material for these much used tropes.

There are few characters in The Invisible Man. Everything revolves around the man the book is named after, but part of the enjoyment is the slow exposition of his history and motivations. Very few others are introduced in any detail, but that’s fine, as they are just there to move the story along. The aim here is to keep the reader guessing for as long as possible, and The Invisible Man does this very well.

Definitely worth reading, it won’t take long and you will enjoy it. Solid HG Wells and a much better mystery, horror sci-fi than Frankenstein.

View all my reviews

The Mote in God’s Eye

The Mote in God’s Eye (Moties, #1) by Larry Niven

My rating: 3 of 5 stars

The Mote in God’s Eye is an incredibly well written sci-fi about 4th millennium humanity making first contact with aliens from a planet described as the “mote” in Murcheson’s Eye, a stellar system discovered by its namesake. Colloquially, the aliens are known as Moties. After intercepting a probe ship assumed to be from the Motie system, the human Empire launches a mission to investigate, which we follow as the main story. I’ll leave you to find out what happens next by reading the book!

The story is well written, remaining clear throughout; even details you might have forgotten are skilfully recalled as needed. It investigates a number of ideas surrounding the potential differences of the Motie species, society and history, such as their asymmetrical physique and communication far faster than humans’. While it fulfils the common fantasy that the aliens we eventually meet, far in the future, are hyper-advanced, there are also aspects that make the Motie species inferior to humans. But the details of that are crucial to the story, so I won’t spoil them for you.

The Mote in God’s Eye is entertaining to read, but overall is not massively inspiring. If there was any use of the contrast with an alien species to comment on the state of humanity, then it was incredibly subtle. Sadly, the ending felt a little rushed; having spent 500 pages setting up the entire scenario and building a complex universe, heroes and anti-heroes, the human Empire’s decision about what to do about the Moties felt like it was based on unsubstantiated guesses, which we as the reader knew were right (having read the sub-plots), but which the characters had no evidence for. I kept waiting for the bigger twist, but the finale was fairly straightforward.

I definitely enjoyed reading this book, though it doesn’t stick in my mind as something I would love to go back and read again. I don’t think I missed anything, and I don’t think another read-through would help me consider a different perspective on the world. I would recommend it though: read it, enjoy the aliens, enjoy the action and enjoy the prospect of first contact!

View all my reviews

Calculating Readability in R

I’ve finally got round to exploring an idea around readability, and was excited to find out that the programming language R already has a library that will calculate a number of readability metrics. That should save me some time writing my own or using an API.

Having installed this library: install.packages('koRpus') I was hoping it would be as easy as calling the function and giving it some text: readability("hello my name is Rikki")

Of course it wasn’t going to be that easy. Here’s my guide to the minimum you have to do to get a readability score out of R. There are plenty of other options to explore, and feel free to ask questions in the comments below.

  1. Install the koRpus library: install.packages('koRpus')
  2. Install TreeTagger. There are installation steps on that site. There are too many (i.e. it could be simpler), but go with it.
    1. Choose somewhere sensible to put the directory (I put the files in /usr/bin/TreeTagger/ on my Mac).
    2. Download each of the files it tells you to: tagger package, tagging scripts, install-tagger.sh and a parameter file for the language of the text you will be analysing. I didn’t download the English chunker file yet (I’ll see if it’s necessary later).
    3. Don’t unzip the archives.
    4. chmod u+x install-tagger.sh
    5. ./install-tagger.sh
    6. Export the TreeTagger variables, add $TAGGER_PATH to your PATH (in your ~/.profile or ~/.bash_profile), then source ~/.profile: export TAGGER_CMD=/usr/bin/TreeTagger/cmd export TAGGER_BIN=/usr/bin/TreeTagger/bin export TAGGER_PATH=$TAGGER_CMD:$TAGGER_BIN export PATH=$PATH:$TAGGER_PATH
    7. Test it (from the TreeTagger directory): echo 'Hello world!' | cmd/tree-tagger-english
  3. Set up your TreeTagger and readability options in R: set.kRp.env(TT.cmd="/usr/bin/TreeTagger/cmd/tree-tagger-english", lang="en")
  4. Write your text to a file: tf <- tempfile(); write(words, tf)
  5. Run the readability function: rdb <- readability(tf)
  6. Get a value out: rdb@Flesch.Kincaid$grade
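To give a feel for what step 6 actually pulls out: the Flesch–Kincaid grade level is just a formula over word, sentence and syllable counts: 0.39 × (words/sentences) + 11.8 × (syllables/words) − 15.59. Here is a rough, hand-rolled Python sketch of that formula. To be clear, this is not what koRpus does (koRpus works from TreeTagger-tokenised text), and the vowel-group syllable counter is a crude heuristic of my own, so treat its numbers as approximate:

```python
import re

def count_syllables(word):
    # Crude heuristic: each run of consecutive vowels counts as one syllable.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    # Sentences: split on terminal punctuation, ignoring empty fragments.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # The published Flesch-Kincaid grade-level formula.
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)

# A trivially simple sentence scores a very low (negative) grade level.
print(round(flesch_kincaid_grade("The cat sat on the mat."), 2))
```

The rdb@Flesch.Kincaid$grade slot in step 6 is the same kind of value, just computed by koRpus with proper tokenisation.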

There we go. Way more complicated than it needed to be, but that’s how you do it. Install an application that the R library interfaces with, write your words to a temporary file and then call the function. Any questions, pop them in the comments below!

Thomas Was Alone

This game has been sitting in my Humble Indie Bundle library for about 9 months, and having completed OlliOlli and struggling with Guacamelee on hard mode, I was tempted to buy it for my PS Vita. However, I thought I’d try it on my Mac, having already paid for it, and see how I like it.

Well, the first half is pretty disappointing. The visuals are cute, with each character depicted as a rectangle of primary or secondary colour, against a black background with white levels. The game is made up of 10 (zero to nine) sets of 10 levels, and the first 50 all feel like they’re either training levels or introducing a new mechanic. The only challenge comes from having to move the characters to their goal (indicated by a white-outline rectangle of the same size and shape as the character) a much longer distance than necessary, though this requires more persistence than skill.

Adding to the frustration are a few oddities of the physics and collision detection, meaning that if you jump next to a moving block, you get attached and pushed along by it until it slows down (as if the moving block has a coefficient of friction high enough to counteract gravity)! I also find the controls for switching between characters unintuitive, which means I occasionally move the wrong character to their death. Not much of a problem when there are 3, but a bit annoying with 7.

Luckily, Thomas Was Alone does a very good job of not making death irritating. The character just respawns either at their starting point or at the most recent respawn point you have passed through. This is quick and automatic, which is important for fluid play.

I’m led to believe the point of Thomas Was Alone is the narrative. Again, while stylistically the narration is excellent (every level there are a few lines of dialogue, delivered by Danny Wallace), I’ve found the story fragmented and not particularly entertaining.

The game picks up at the midway point, when James is introduced, with an interesting enough variation of the existing mechanics to make levels a little deeper. Finally there is a unique skill that requires some thought to solve the puzzles, and meaningfully depends on co-operation between the characters; at this point it almost feels as if this game could be multiplayer co-op. Scenario 5.10 is a beautiful example of this, though I fear it is a one-off and too short-lived.

The second half does pick up a little, though it’s still only about half of its levels that present a challenge. The narrative also begins to make sense, and there are a few more cultural references. There is also a level (7.2) where the goals are all moving, and at certain points through solving the level (which takes a little planning, but still isn’t hard), the beeping of rectangles meeting their goals makes a beautiful sort of bleepy music 🙂

At the end of the day it is rarely more than a Towers of Hanoi puzzle with secondary-colour rectangles, with a slowly revealed simple story narrated by Danny Wallace. However, there are a few moments where Thomas Was Alone’s promise shines through, moments where co-operation between the different characters is crucial to solving the puzzle. It is for these few moments that it deserves recognition; it’s just a shame there aren’t 25 great levels, instead of 5 brilliant ones and 95 fairly average ones.

Usability Testing with Google Hangouts

I want to implement Steve Krug’s budget usability testing lab where I work. It is described in detail in his book Don’t Make Me Think, but in summary it involves sitting a user down at a computer, getting them to use your website or software to do some predefined tasks, all the while streaming the video of their efforts to the developers in another room. Krug suggests you do this using a relatively cheap camcorder and some cables to transmit it to the other room.

I want to do this, because I feel the software we procure, and sometimes the software (largely websites) that we develop, seriously lack in usability, purely because we don’t make it an important part of the purchasing or development process. I want to make it so easily available that we have no excuse for not doing it.

While I do have a camcorder to hand, my initial plan was to use screen capture, webcams and video streaming to make this an even lighter weight endeavour. It would also be nice to record the entire thing to watch back later.

My first stop was Google Hangouts. I knew for certain that Hangouts allows webcam streaming and screen sharing. However, I wasn’t sure how well it would work for this, or if I could do both at the same time. It appears that you cannot stream webcam and screen share at the same time, which is a shame. It might be possible by using two Google accounts, but that seems a little overkill and would make the setup tricky.

The big killer is how slow the visual response is. Moving around on the source computer results in very laggy movement on the viewing computer. I don’t think this will give a good enough experience for the developers watching, and will reduce the likelihood that they will use it.

Unfortunately, I don’t think Google Hangouts is the right solution to this problem. As it is free and relatively easy to set up, I was hoping it would allow me to rapidly get this up and running, but it looks like I will have to try something else.

Suggestions of other software that will stream and record screen captures, webcam and audio in the comments below would be much appreciated!