Posted on October 20, 2010
PROD is a project directory and monitoring tool for JISC-funded projects, used within JISC CETIS to aid programme support. Although the tool holds a great deal of interesting information on projects, I feel it is sometimes hard to convey that information to people who do not use the tool on a daily basis, and I have been wondering how some of the information in PROD might be disseminated to a wider audience. I thought I would start by taking a programme with a rich array of information in PROD, such as Curriculum Design, and see how the information might be made more interesting.
One of my experiments was to try to get PROD to generate something visual for users to click and explore. Below is one of my first attempts at getting PROD to generate a mind map for the programme entries in PROD. You should be able to click and drag around the map to get a richer picture of the programme, and embed this Google gadget in your own web pages. I think the interface is a little hard to use, so you may want to download the XML, which can be imported into your own copy of FreeMind, or view the mind map in a separate window. I hope to tweak the map to include items such as hyperlinks out to relevant projects and information.
Finally, I tried using the Graphviz set of visual rendering tools to show relationships between the different projects in the programme. I haven't yet attempted to tidy these up or make much sense of them. You can click to get the larger version.
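For anyone curious what the Graphviz input looks like, here is a minimal sketch; the project names and relationships below are hypothetical, not taken from PROD:

```shell
# Write a tiny DOT file describing some hypothetical project relationships
cat > programme.dot <<'EOF'
digraph programme {
  rankdir=LR;
  "Curriculum Design" -> "Project A";
  "Curriculum Design" -> "Project B";
  "Project A" -> "Shared Technology" [label="uses"];
  "Project B" -> "Shared Technology" [label="uses"];
}
EOF

# Render it to a PNG if Graphviz is installed
if command -v dot >/dev/null 2>&1; then
  dot -Tpng programme.dot -o programme.png
fi
```

The real files are generated by PROD for a whole programme, but the structure is the same idea: declare the nodes and edges, and dot lays the graph out automatically.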
Posted on September 15, 2010
The Village Pump is an information hub for activities relating to the Flexible Service Design programme. The hub is a go-to place for FSD-related articles, events and contact information. Articles in the hub are aggregated from sources using their RSS feeds; we currently gather from sources such as blogs, JISCMAIL lists and forums.
We would like to populate the hub with as much useful information as possible, so if you have any suggestions for sources you would like to see aggregated into the Village Pump, or if you are running a project blog and would like its posts to be aggregated, then please comment on this entry with the details and RSS feed.
Posted on July 19, 2010
I recently read a blog post about intrinsic and extrinsic rewards. I've come across and heard a lot of talk following a similar thought pattern to this blog post, the common theme being that extrinsic rewards can be counterproductive, with a suggestion that we could learn from video games' intrinsic reward systems.
But increasingly video games have both intrinsic and extrinsic rewards. How many hours did you spend trying to get the gnome to the end of Left 4 Dead 2 so you could wear your virtual Depeche Mode t-shirt on Xbox Live? How many of us put our gamerscore or trophies on our forum posts? Games are getting increasingly tied into the social networks that surround content delivery systems, and as a result many video game rewards do not impact on the game itself but give a little something extra to show off in these networks.
Posted on July 15, 2010
The Cloudworks team have been developing an API for Cloudworks to allow developers to create their own visualizations, programs and mash-ups. The API currently supports calls to get data, and I found this a great opportunity to test different ways of visualizing and organising the resources within Cloudworks for the Design Bash 2010. Sheila MacNeill has created a cloud to store and discuss Cloudworks API tests; my test demos can be found there, and the Cloudworks team are encouraging developers to get stuck in and have a go.
If you would like to play with the API yourself and add your own demos you will need to get an API key by signing up to Cloudworks and contacting Nick Freear. There is some excellent documentation on using the API already made available by the Cloudworks team, and Nick was happy to provide example code; his PHP example can be seen here (don't forget to change the user agent/API key/cloud ID):
Edit: Removed the code, since the best place to grab a PHP example is Nick's Snippler.
This should get you up and running and is pretty self-explanatory. I hope to turn my demos into something more useful and post some specific examples of things that can be achieved using the Cloudworks API.
Posted on July 8, 2010
I have recently been reading an excellent book by Andrew Pickering entitled 'The Cybernetic Brain'. The first half of the book gives an incredibly rich and fascinating history of early British cybernetics, and although I have not yet finished the book I am really enjoying the stories behind the early cybernetic thinkers and the extraordinary range of backgrounds from which they came. From the history in the book, and by recognising the common themes between the approaches to different fields, I have come to understand that cybernetics takes a 'steering' approach to achieving its goals: presumed knowledge takes a back seat, while understanding and reacting to an unpredictable and complex changing environment moves to the front.
I have started to think back to some of the work I have done previously and how I might have approached it differently. At university a big interest of mine was network security, and I spent plenty of time on assignments using products such as Snort and writing code to detect predetermined malicious patterns in network packets. These tools detect malicious activity based on rule sets: the tool spots something defined within the rules and reports it back to the user.
The methods I used required the system to have an indefinite amount of knowledge before it could be called intelligent. To create the rules we must already have an understanding of what is going to happen within our environment, which we do not and will not have. Would an approach derived from a shared agreement on what normal network activity looks like eventually teach itself to be secure? How would we move from a rule-based system to a self-steering one?
I think I am starting to understand why the early cybernetic thinkers came from such a wide array of areas… and also why they were all mad!
Posted on April 28, 2010
I have had a few requests for help on deploying the Content Transcoder on a local machine. The process is simple but has one or two gotchas. I have done this on OS X and Ubuntu but it should be pretty similar on Windows.
You will need:
Tomcat 6.x. As far as I can tell it must be 6+, since the transcoder doesn't seem to play nicely with the XSLT processor packaged with earlier versions.
Set up Tomcat
1) Unpack Tomcat; I unpacked mine to the desktop.
2) Set permissions on the Tomcat directory; I opened a terminal (/Applications/Terminal) and typed something along the following lines:
sudo chmod -R 755 /Users/david/Desktop/tomcat/
You may also need to set the environment variable JAVA_HOME. On OS X you may find this guide helpful. On Windows, right-click Computer -> Properties and go to the Advanced tab, select Environment Variables and add JAVA_HOME, setting it to point at the JDK, i.e. C:\Program Files\Java\jdk1.*
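On OS X or Linux you can set JAVA_HOME from the terminal instead; a rough sketch (the fallback path is only an example, so adjust it to wherever your JDK actually lives):

```shell
# Set JAVA_HOME for the current shell session
if [ -x /usr/libexec/java_home ]; then
  # OS X: ask the system where the active JDK is
  export JAVA_HOME=$(/usr/libexec/java_home)
else
  # Linux: point at your installed JDK (example path)
  export JAVA_HOME=/usr/lib/jvm/default-java
fi
echo "JAVA_HOME is $JAVA_HOME"
```

To make the setting permanent, add the export line to your ~/.profile.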
Deploying the transcoder is as simple as placing the war file into your /tomcat/webapps folder. Tomcat will do the dirty work for you.
Tomcat can then be run using the startup.sh script, or startup.bat for Windows; in your terminal this can be done with something like this:
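Something along these lines, assuming you unpacked Tomcat to your desktop as above (the path is an example, so point it at your own unpack location):

```shell
# Start Tomcat from wherever you unpacked it
TOMCAT_HOME="$HOME/Desktop/tomcat"
if [ -x "$TOMCAT_HOME/bin/startup.sh" ]; then
  "$TOMCAT_HOME/bin/startup.sh"
else
  echo "No startup.sh under $TOMCAT_HOME/bin - check the path"
fi
```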
Tweak Transcoder configurations
You may want to make some changes to how transcoder is set up. You will need to change the configuration to use Apache Derby instead of the default MySQL. This can be done by editing the file at:
This should be a text file with two sets of properties, with the bottom set commented out; you need to switch these around so the file looks like this:
You may then wish to stop and start Tomcat:
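A restart is just the shutdown script followed by the startup script; a sketch, assuming the same ~/Desktop/tomcat location as before:

```shell
TOMCAT_HOME="$HOME/Desktop/tomcat"   # adjust to your unpack location
if [ -x "$TOMCAT_HOME/bin/shutdown.sh" ]; then
  "$TOMCAT_HOME/bin/shutdown.sh"
  sleep 5   # give Tomcat a moment to finish stopping
  "$TOMCAT_HOME/bin/startup.sh"
fi
```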
On Windows you are looking for startup.bat and shutdown.bat.
Now you should be ready to go, simply by pointing your favorite web browser at localhost:8080/transcoder.
Help, it's not working!
I see tomcat at localhost:8080 but nothing at localhost:8080/transcoder.
If you didn’t rename your war you will need to navigate to localhost:8080/transcoder-web-****
I see transcoder but it won’t convert packages.
If you changed the properties above and still have no luck the chances are that your JAVA_HOME is not set correctly. Google and a bit of tinkering should sort you out.
Posted on April 6, 2010
Demon’s Souls is currently my favorite game and although I have not progressed very far in the game I have poured many hours in to the first few zones. I have created many different characters setups, explored early dungeons extensively, studied enemy locations and pinpointed their weaknesses.
Typically I am the sort of player who quickly plays through a story never to touch the game again, so what makes Demon’s Souls different?
When you first play Demon's Souls you might be put off by the crippling difficulty. The enemies are tough, and although you will die frequently the game is fair and wants you to learn. Each battle will teach you new tricks; explore the dungeon and you will find shortcuts and vantage points; an ingenious online mode lets you watch how fellow warriors have fallen. Slowly you will gain insight, and the level will seem slightly easier with each attempt.
The true genius of Demon's Souls is the shift of experience from the player character to the actual player. Grinding in Demon's Souls is all about growth in you rather than in your digital representation. Don't expect to be pressing X for 4 hours against rats here.
I could sing the praises of Demon's Souls all day, but instead I suggest you buy a copy and experience it for yourself. Unfortunately the game has yet to be released in Europe, but it's quite easy to pick up a US import copy.
Posted on January 7, 2010
Here I hope to give a quick start on how to use Schematron with your XCRI-CAP documents to give useful feedback.
The instructions here were originally tested on OS X but should also work on Windows or your Linux distro of choice, providing you have Java. If there are any differences I am not aware of, please leave a comment.
Setting up your environment
You will need:
• ISO Schematron
• A Schematron Schema
• An XSLT processor
• An XCRI-CAP document for validation.
We’ll be using ISO Schematron, which is available for download from the Schematron website. The file we want is iso-schematron-xslt2.zip; I recommend unzipping it to your desktop and renaming the directory schematron. Place the XCRI-CAP document you wish to validate into this directory.
The Schematron schema is transformed into an XSLT stylesheet using an XSLT processor; in this example I am going to use Saxon-B for Java, which is an open source XSLT processor.
Once downloaded, you will need to add saxon.jar to your classpath. On OS X 10.5+ you can place it into your /Library/Java/Extensions directory.
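On other systems (or if you would rather not touch the extensions directory) you can put the jar on the classpath yourself; a sketch, assuming saxon.jar sits in your schematron directory:

```shell
# Make saxon.jar visible to java via the CLASSPATH variable
export CLASSPATH="$HOME/Desktop/schematron/saxon.jar:$CLASSPATH"

# Or skip CLASSPATH entirely and name the jar on each invocation:
# java -cp "$HOME/Desktop/schematron/saxon.jar" net.sf.saxon.Transform ...
```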
A Schematron schema is needed to validate your XCRI-CAP XML document against. The schema takes the form of XPath expressions; I have written one based on XCRI-CAP 1.1 which you can download here; feel free to use and modify it. Once downloaded, place it into your schematron directory. You will also need this list of postcodes.
You should now have a directory named schematron with multiple files; it is important that you have the following five:
• your own XCRI-CAP document!
Validating your XCRI-CAP documents
First we need to create an xsl stylesheet which will be used as the validation engine against our XCRI-CAP document.
Open a terminal/prompt, navigate to your schematron directory and issue the following command:
java net.sf.saxon.Transform -o xcri.iso.sch.tmp.xsl -s xcri.iso.sch iso_svrl_for_xslt2.xsl
This will use the Saxon processor we installed earlier to compile the schema into an XSLT stylesheet (xcri.iso.sch.tmp.xsl) from our schema and the Schematron XSL. The next step is to run the file we have created against our XCRI-CAP document to create an error report. To do this, run the following, changing ‘myfile.xml’ to the name of your document:
java net.sf.saxon.Transform -o errors.report.xml -s myfile.xml xcri.iso.sch.tmp.xsl
You should now have an error report called errors.report.xml. The report is in SVRL format, a simple language defined in ISO Schematron. SVRL can be used as the basis for further transformations, and to demonstrate this I have written a small XSL which you can use to transform it into an HTML document. If you wish to do so, download it to your schematron directory, make any modifications you require and use the following:
java net.sf.saxon.Transform -o errors.html errors.report.xml svrl_transformer.xsl
You should now have the HTML document of errors, errors.html, in your schematron directory.
At this point you are producing feedback on your XCRI-CAP documents as a simple HTML document of errors. If you are using a version of Saxon that allows access to its extensions, then we can use the saxon:line-number() extension to report line numbers.
To do so replace these files in your schematron directory with these modified versions:
And run through the validation steps again with the -l flag set, e.g.:
java net.sf.saxon.Transform -l -o xcri.iso.sch.tmp.xsl -s xcri.iso.sch iso_svrl_for_xslt2.xsl
java net.sf.saxon.Transform -l -o errors.report.xml -s myfile.xml xcri.iso.sch.tmp.xsl
java net.sf.saxon.Transform -l -o errors.html errors.report.xml svrl_transformer.xsl
Don’t have time to give it a go? You can use the ‘beta’ web-based validator that will validate your XCRI-CAP document using the same steps shown in this post. You can find the validator at:
Comments very welcome!
Posted on March 30, 2009
Digital distribution is everywhere; applications such as iTunes provide the ability for digital products such as MP3s, movies and computer software to be delivered to audiences over the Internet instead of on physical media such as CDs, DVDs or Blu-ray discs. They provide easy and direct sales to a global market. With iTunes and the App Store, Apple may be the company that comes to mind when digital distribution is discussed, but it shouldn’t be forgotten that plenty of video game consumers have been using these systems for years, and recent announcements at this year’s Game Developers Conference 2009 have shown that there are plenty more exciting developments to come.
Over the past 5 years gaming has seen a massive rise in digital distribution systems; many customers have been more than willing to make the switch from obtaining a physical copy of computer game software from a ‘bricks and mortar’ shop to downloading it through distribution systems such as Steam, Impulse, Xbox Live Arcade and PSN. Some of the advantages these distribution systems have over their conventional counterparts include:
- Instant user feedback
- Anti-cheating Systems for online games
- Auto patching
- Downloading purchased-content from any location
For me the big draw was (and still is!) the last bullet point: the idea that once I bought a game I could download it as many times as I wanted. Even if I buy my product from a ‘real’ shop, the first thing I do is enter the serial code into one of these systems as a backup, just in case the disc gets lost/snapped/broken.
Since I started using such systems when Steam first launched in 2003 there have always been two questions for me. The first is how long it will be before we no longer need to download the game at all. When can we stop buying into the expensive CPUs, GPUs and PPUs that games require and let all the processing be done server side? A recent announcement this week and ongoing work by Valve suggest it might be closer than we think.
The second was how digital distribution systems could move into different markets. In the UK most gamers will have a broadband connection and a 7th-generation console (or PC), since we are constantly after that new game and are an easy target for publishers; but what about markets that don’t buy into the latest consoles and games? Brazil has a massive gaming market, but one quite different from the situation in the UK, with older consoles such as the Master System still seeing re-releases as late as 2006. Another exciting announcement at GDC 2009 saw a console designed exactly for such markets, pushing digital distribution as its method of selling games.
Moving into the Cloud
This week at the Game Developers Conference 2009, OnLive was announced: a game distribution system that promises to take the load away from your computer and into the cloud, allowing resource-hungry games to play on modest hardware. The OnLive developers maintain that the main bottleneck is bandwidth, with lower-bandwidth users simply being met with a smaller screen resolution. This is really exciting news for gamers; does it mean that digital distribution and cloud computing will kill the console/PC spec war, and that we will no longer go through generation after generation of video game consoles?
Although it would seem that Valve don’t think we are ready for such a radical shift, they are still moving in a similar direction with their product ‘Steam Cloud’. Although Steam Cloud still delivers the game to the end user via a full download, the idea is that variable data such as save games and settings are stored in the cloud, meaning users can log on from any terminal with the game installed and carry on from where they left off.
Expanding the Market
Tectoy announced they would be attempting to push digital distribution into ‘The Next Billion’ market by creating a console that will sell and distribute games over 3G or EDGE networks using a virtual currency not unlike Microsoft Points or Wii Points. There seems to be no shortage of publishers wanting their games on the system, and a quick scan of the games that will be available (Crash Nitro Kart, Quake and Sonic Adventure, to name a few) suggests that these publishers are eager to bring their old games to new markets, with digital distribution being the ideal means to do so. I find it fascinating that the company has decided to enter these markets via the digital distribution route.
It is no secret that there is a huge amount of money in games, and this is the driving force behind these incredible innovations. As always, though, the technology will filter down and hopefully we will see it in other areas. Could application processing in the cloud mean an end to the PC CPU/GPU spec wars, forcing PC manufacturers to focus their efforts in other areas? Will it mean that high-end programs will be able to run on your mobile phone, with a small client purchase simply being made over 3G/WiMAX etc.? I’m sure there are plenty of exciting discussions to be had at future CETIS Cloud Computing and Institutions Working Group meetings!
Posted on March 6, 2009
Every year I pay £40 to Microsoft for the privilege of playing my Xbox Live games online. I don’t mind paying the money too much, considering Live is far ahead of PSN (although I think Sony are closing the gap). When I spend £40 on a game that says I can play it on Xbox Live I expect to be able to play the game I have purchased for quite a while; I understand that eventually I won’t be able to, because either:
A) The popularity of the game will dwindle, meaning I can no longer play Whacked! simply because there is nobody to play it with.
B) Not enough people play to justify server support and it is pulled. Sometimes I am OK with this (as with PSO, which I played for free for yonks on Sega’s servers) and sometimes it pisses me off (EA deciding you have to buy the 2009 version).
Quite often games will give you the option to enhance your online gaming experience via micro-purchases: map packs, skins, weapons. This is fine; sometimes I buy the upgrades (although don’t get me started on recent Live and PSN content price increases). I own all the Guild Wars packs, and I will certainly be getting anything Valve release for Left 4 Dead. These are two of my favorite games, which I choose to upgrade; however, having paid £40 for the original game, if I choose not to upgrade then I should be robbed of nothing; gameplay should carry on as normal, just without the optional extras. I repeat: THE OPTIONAL EXTRAS!
Yesterday I decided to play Halo 3; a game whose multiplayer I don’t really like, but I did want to have a laugh playing it with some old friends. Having already paid £40 for the game and my £40 subscription fee, that shouldn’t be a problem. Right?
After making my way through the menu with that god-awful music, I am told I can’t play any more because I haven’t paid for the optional extras. That’s right, the OPTIONAL ones. I’m all for additional content for games, but not when I have to carry on buying it to keep the original functionality. A quick Google search just returns fanboys telling me to shut up and buy it, since it’s only 600 points (although you can’t even buy MS Points in blocks of 600).
This isn’t the point. I shouldn’t have to carry on buying content to keep the original functionality; I don’t mind additional content if it’s optional and reasonably priced. Take LBP for example: I don’t own the costumes that cost £1.49, and what do I lose? Nothing.
Shame on you Bungie and Microsoft. Shame Shame Shame and more Shame; all with capital letters.
They should take a leaf out of ArenaNet’s book, who have been releasing episodic Guild Wars content which funds the online servers that keep their MMORPG free; it doesn’t matter which episodes you do or don’t own, and no original functionality is lost. Additional money was made by optional ‘real’ purchases such as cardboard cut-outs of the characters, t-shirts and badges, none of which harm the game if you don’t own them!
Another example I would champion is Siren: New Translation. Sony, whom I would normally put into the Bungie/Microsoft camp of shame, did a brilliant job of chopping a game into episodes, hooking you in and making you hungry for more by releasing them cheaply.