Planets ahoy! The benefits of getting up early.

It was an early start, 4:30am to be precise, but that’s the only time when you can catch anything really photogenic in the sky from my back garden at the moment as I’ve yet to get beyond the light pollution for the deep sky objects.

So, yes, the early start, at “stupid o’clock.”

It was a beautiful morning. The sky had lost all of the high, contrail-derived cloud from the night before, which had obscured practically everything, and the air was still. It was chilly enough to need a hat and fleece but otherwise comfortable. The stars shone but were nothing beside Venus, the Moon and Jupiter.

Seeing as Jupiter had for so long been out of view, I immediately slewed the telescope around to point at it, looked in the eyepiece, focused and discovered that the shadow of one of the moons was passing across the face and was close to the edge. I needed to be quick to catch it in an image, so I rushed the “Imaging Source” camera out of its box, fitted the Baader filter and the Powermate 2.5x magnifier and started up the software.

After some critical focusing I pressed the “Capture” button and streams of data passed onto the hard disk. I’d made it. Little did I know that the first real capture was the best of the night, which, after later processing, produced this image:

Jupiter with Europa casting a shadow and Io.

5:09am: Jupiter with Europa (lower left) and Io (upper right). Europa’s shadow is just leaving the edge of the face of Jupiter.

After almost an hour of imaging Jupiter, with the glow of dawn swiftly growing, I turned my attention to the Moon. There, in the stark contrast on the edge of the illuminated half, sat the crater Copernicus. Such an intricate crater, with its ejecta field strewn around it. So, this became my second target of the morning:

Copernicus

6:03am: The lunar crater Copernicus.

With light levels increasing and sunrise soon to be upon me, there was only one last target: Venus.

Because I’m hardly ever up this early, and because I have no view of the western sky from the observatory, I’ve never actually imaged Venus before. This time I didn’t bother removing the camera from the focuser but hoped I’d be able to find the planet using the finder scope alone. It took a while to centre in on it but eventually I did. After a few minutes of tweaking the exposure, I took my final image of the day:

Venus in the morning sky.

6:14am: Venus shining brightly.

And so, that was that. I stowed away the telescope, shut off everything, closed the roof and came indoors, and off back to bed for a couple of hours.


Astronomical events: February to May

My last update on my astronomical exploits was way back at the beginning of February. At that point I’d just got the new telescope installed and Mars was getting closer to opposition.

Between then and the end of May was quite a busy time in the sky when it came to planetary observation as Mars continued to be visible from my back garden for much of the time and Saturn came out to play as well. Unfortunately, after the third week in May both planets were obscured by the house by the time dusk fell. Even so, I managed to get some decent images.

Mars

I’d already managed to get one good image of the planet by the beginning of February but, due to cloud cover, the next opportunity wasn’t until early March. Thankfully, there was a short period of very good seeing, allowing me even to image clouds developing within the atmosphere.

Details on the surface weren’t that clear, but that was partly due to the low quality of the camera and its low speed at the light levels available. Once the atmosphere became more turbulent the results were nowhere near as good.

So, I decided to buy a better-quality planetary camera. The resolution was still 640×480 but its noise levels were far lower, so that even without getting the colour balance correct on the first go I managed to get a far better image:

Unfortunately, by the end of March the weather closed in and the next time I could get out to view the sky was in May.

By the middle of May the planet was rapidly receding from view, markedly shrinking and becoming harder to image, especially as by the time it was visible it was almost behind the house. However, because of the change of viewing angle its phase had become obvious, making it far easier to see that it was indeed a sphere:

Before long, however, it became impossible to image.

Saturn

Saturn wasn’t easily visible from my garden until the clouds cleared in May, which gave me only a few short weeks in which to view and image it before it too disappeared behind the house and into the dusk. Also, most evenings the sky was just too unstable to get decent images of the planet given the amount of magnification required. Having said all that, I did manage a few pretty decent images, such as this:

Weather based computing definitions.

Cloud Computing

A computing resource located “out there” somewhere, connected to the Internet and operated by a third party.

When the heat is on, just like real clouds, they can either evaporate or become a storm (see Monsoon Computing). In either case it’s not good news.

Fog Computing

Like Cloud Computing but down to earth, i.e. based in reality and generally under the organisation’s direct control. Often called a Corporate Cloud Computing resource.

This generally hangs around longer than is required but never lets the temperature get too high.

Mist Computing

You’re sure that you purchased the equipment for your corporate cloud computing resource, but you can’t see very much of it and it’s not a lot of use.

Very Light Drizzle Computing

You’re pretty sure that there must be a computing resource somewhere, you can feel it, but you can’t find it.

Drizzle Computing

You seem to have a large number of lightweight, low-powered computing systems for your processing. However, all they seem to do is annoy you and never actually do anything useful.

Rain Computing

You have a large number of independent computers all working to solve your problem, or at least dissolve it.

Stair-Rods or Monsoon Computing

Somehow you seem to have huge numbers of high-power processors on your hands, all working on your problem uncontrollably. Unfortunately, the upshot of this is that your problem isn’t solved; it’s washed away by the massive deluge of cost and possibly information overload.

So, do you have any more/better amusing definitions for weather-analogous computing names? If so, post them as comments below.

NotSoBASIC

As discussed in a previous posting, I’ve been musing over the development of a modernised version of the classic procedural BASIC language, especially with the Raspberry Pi in mind.

With this in mind I’ve been setting out some goals for a project and working a little on some of the syntactical details to bring structures, advanced for-loop constructs and other modern features to a BASIC language as backwardly compatible with the old Sinclair QL SuperBASIC as possible.

So, here are the goals:

  1. The language is aimed at both the 13-year-old bedroom coder, getting him/her enthused about programming, and the basic needs of the general scientist. (Surprisingly, the needs of these two disparate groups are very similar.)
  2. It must be platform agnostic and portable. It must also have a non-restrictive, unencumbered licence, i.e. not the GPL, so probably Apache, so as to allow it to be implemented on all platforms, including Apple’s iOS.
  3. It must have at least two, probably three, levels of language: beginner, standard and advanced. The beginner level would, like its predecessors in the 8-bit era, be forced to use line numbers, for example.
  4. It must have fully integrated sound and screen control available simply, just as in the old 8-bit micro days. This would, with the proper manual, allow a 13-year-old to annoy the family within 15 minutes of starting to play.
  5. The graphical capability must include simple ways to generate publishable scientific graphical output, both to the screen and as Encapsulated PostScript, PDF and JPEG.
  6. The language must have modern compound variables, such as structures, possibly even pseudo-pointers so as to be able to store references to data or procedures and pass them around.
  7. The language should be as backwardly compatible with Sinclair QL SuperBASIC as possible. It’s a well tested language and it works.
  8. The language should be designed to be extendable but it is not envisaged that this would be in the first version.
  9. The language IS NOT designed to be a general purpose application development language, though later extensions may give this ability.
  10. The language will have proper scoping of variables with variables within procedures being local to the current call, unless otherwise specified. This allows for recursion.
  11. All devices and files are accessed via a URI in an open statement.
  12. Channels (file descriptors) must be a special variable type which can be stored in arrays and passed around.

As I said earlier, I’ve been thinking about how to do a great deal of this syntactically as well. This is where I’ve got so far:

[Edit: The latest version of the following information can be found on my website. The information below was correct at 10am 23rd February 2012.]

Variables.

Variable names MUST start with an alphabetic character and can only contain alphabetic, numeric and underscore characters. A suffix can be appended so as to give the variable a specific type, e.g. string. Without a suffix character the variable defaults to a floating point value.

Suffixes are:

$ string
@ pointer
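
For example, purely illustratively (REMark is the SuperBASIC comment keyword):

count = 42 : REMark no suffix, so a floating point variable
name$ = "Stephen" : REMark the $ suffix makes this a string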

Compound variables.

Compound variables (structures) can be created using the “DEFine STRUCTure” command to create a template and then creating special variables with the “STRUCTure” command:

DEFine STRUCTure name
varnam
[…]
END STRUCTure

STRUCTure name varnam[,varnam]

An array of structures can also be created using the STRUCTure command, e.g.

STRUCTure name varnam(3)

The values can be accessed using a “dot” notation, e.g.

DEFine STRUCTure person
name$
age
DIMension vitals(3)
END STRUCTure

STRUCTure person myself, friends(3)

myself.name$ = "Stephen"
myself.age = 30
myself.vitals(1) = 36
myself.vitals(2) = 26
myself.vitals(3) = 36

friends(1).name$ = "Julie"
friends(1).age = 21
friends(1).vitals(1) = 36
friends(1).vitals(2) = 26
friends(1).vitals(3) = 36

As with standard arrays, arrays of structures can be multi-dimensional.

Structures can contain any number of standard variables, static array types and other structures. However, only structures defined BEFORE the one being defined can be used. Structure definitions are parsed before execution of the program begins; structure variable creation takes place during execution.
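
As an illustration of that ordering rule, something like the following should be legal. (The exact syntax for embedding one structure within another hasn’t been settled yet, so treat the inner STRUCTure lines as a sketch only.)

DEFine STRUCTure point
x
y
END STRUCTure

DEFine STRUCTure line
STRUCTure point start_point
STRUCTure point end_point
END STRUCTure

Here “point” may be used inside “line” only because it was defined first; the reverse order would be an error.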

Loops.

FOR/NEXT:

FOR assignment (TO expression [STEP expression] | UNTIL expression | WHILE expression) [NEXT assignment]
..
[CONTINUE]
..
NEXT [var]

The assignment flags the variable as the loop index variable. Loop index variables are normal variables.

The assignment and the evaluation of the assignment expression happen only once, when entering the loop. The test expression gets evaluated at the beginning of every trip through the loop. If the test already fails at loop entry (the index is past the TO value, the UNTIL expression is non-zero or the WHILE expression is zero) the commands within the loop do not get run.

The STEP operator can only be used if the loop index variable is either a floating point variable or an integer. The expression is evaluated to a floating point value and then added to the loop index variable. If the loop index variable is an integer then the value returned by the expression is stripped of its fractional part (truncated rather than rounded) before being added to the variable.
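
For example, two illustrative loops under the semantics described above (PRINT is assumed to behave as in SuperBASIC):

FOR i = 1 TO 9 STEP 2
PRINT i : REMark prints 1, 3, 5, 7 and 9
NEXT i

FOR x = 1 UNTIL x > 100 NEXT x = x * 2
PRINT x : REMark prints 1, 2, 4 ... 64; the test then ends the loop
NEXT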

WHILE/END WHILE:

WHILE expression [NEXT assignment]

[CONTINUE]

END WHILE

Equivalent to a FOR loop without an assignment, using the WHILE variant, e.g.

x = 10
WHILE x > 3 NEXT x += y / 3

END WHILE

is equivalent to

FOR x = 10 WHILE x > 3 NEXT x += y / 3

NEXT

DO/UNTIL:

DO

[CONTINUE]

UNTIL expression

The commands within the loop are run until the expression evaluates to a non-zero value.
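
A short illustrative example, again assuming SuperBASIC-style PRINT:

x = 5
DO
PRINT x
x = x - 1
UNTIL x = 0

This prints 5 down to 1; the body always runs at least once because the test sits at the bottom of the loop.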

Functions and procedures.

A function is merely a special form of a procedure which MUST return a value. The suffix of the function’s name determines the type of the returned value, in the same way as with variable names.

DEFine PROCedure name[(parameter[,parameter[…]])]

[RETURN expression]
END PROCedure

DEFine FUNction name[(parameter[,parameter[…]])]

RETURN expression
END FUNction

Parameters are local names which reference the passed values by reference. This means that any modification of the parameters within the procedure will change the value of any variables passed to it.

Variables created within a procedure will be local to the current invocation, allowing recursion. Variables with global scope are available within procedures but will be superseded by any local variables with the same name.
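
A sketch of both behaviours (IF ... THEN and PRINT are assumed to work as in SuperBASIC):

DEFine PROCedure increment(value)
value = value + 1
END PROCedure

x = 10
increment x : REMark SuperBASIC-style procedure call
PRINT x : REMark prints 11, because parameters are passed by reference

DEFine FUNction factorial(n)
IF n <= 1 THEN RETURN 1
RETURN n * factorial(n - 1)
END FUNction

PRINT factorial(5) : REMark prints 120; n is local to each call, so recursion is safe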

Joining the fast lane: Fibre to the Cabinet broadband Internet access is here.

Well, after quite a wait, the Cowley BT telephone exchange has finally been enabled for Fibre to the Cabinet (FTTC) broadband. Even by BT’s own estimate, the exchange has been nearly two months late coming on-line.

So, what does having the new service involve?

Well, other than a hefty £80+VAT fee, it merely requires a BT Openreach engineer to visit your house, install a modem and an additional face-plate filter onto the house’s master “line” socket, and then go to the street cabinet containing your connection to rewire it. You will also need a firewall/router which can talk PPPoE, in other words one which can use a network cable instead of a phone cable. These are the same as those used with Virgin Media cable-modems.

Although BT (via your ISP) will inform you that the process will take up to an hour, in fact it takes a lot less time than this. It’s about 5 minutes for the engineer to unpack the new modem and fix the faceplate, and then a further 10 minutes while he hunts for the correct street cabinet and re-wires your phone line. Assuming that you have your router fully set up beforehand, that’s it. He just does a few tests and leaves.

In my case, I had a Billion BiPAC 7800N router, which can connect both via ADSL (phone line) and via a network cable, so all I needed to do was change a setting and reboot it.

So, this, after some tidying, is my new communications system:

Now that everything’s wall mounted and I’ve put all the wires into a conduit it looks a whole lot neater than before. Also, it’s unlikely to be knocked or cables snagged.

At the top of the picture you can see the Billion router. It’s not much to look at but it is a superb router. I do like the way that it can be mounted vertically on the wall, thus taking less space laterally.

Below the router is the BT modem. Thankfully this is the mark 3 model so is less likely to die horribly.

Finally, connected directly into the wall power socket is the Devolo 200Mb/s power line networking module. This connects to a similar unit in the spare bedroom, where my server sits, and to a multi-port power-line network switch in the living room to which is connected the TV, PS3 and amplifier.

So, what does all this shiny new equipment give me over and above what I had before? Other than the ten-times download speed increase and the four-times upload speed jump, it also means that the connection should be far more stable. I’m also only paying about £3 more for this service than I was for the ADSL MAX service I was previously on, and I get an extra 30GB of download quota bundled in with it.

Basically, I’m happy with it and that’s all that matters.

[Edited to add historical broadband speed test data]

Mars attacks

Well, Mars is finally getting into a position where I can image it at a reasonable hour of the day (or night).

So, I dozed in bed for most of this evening before getting outside and attempting to get at least one image of Mars.

Mars is rather more difficult to image than Jupiter due to it being far smaller. Thankfully, for Christmas I got a 5x magnification lens which allows me to make the image on the “webcam” camera large enough to even attempt the task. Even so, with the degree of atmospheric disturbance, I never thought I could get anything decent tonight.

Well, surprisingly, after a bit of processing in Registax, this is what I managed to obtain:


Astronomical observatory: News update

I’ve been rather quiet about the observatory and astronomy in general in this blog for quite some time now. (Actually, looking back it was last June.) So, what’s been going on?

Observations.

When the weather has allowed I’ve been out a number of times, the most fruitful being around the end of October, when Jupiter was close to opposition and hence at its largest.

During this time the weather was helpfully quite good, with a number of days of exceptionally stable air giving rather good viewing conditions (known as “seeing”), which gave me a good opportunity to try out a time-lapse video of Jupiter turning.

It took a few hours to image Jupiter using a modified web-cam device and then a further half a day to process each of the video clips into single images such as this:

Each of the images was then again put into Registax, aligned and then made into a time-lapse movie:

Which does look rather impressive.

Unfortunately, after that period the weather hasn’t been very kind and Jupiter has drifted further away from us as we journey around the Sun. Still, Mars is on its way!

After the purchase of a new telescope at the new year (more later) I’ve managed to get out only a couple of times, mostly to do tests. The best image so far has been of the Orion Nebula taken by imaging a number of times with my Nikon D90 and then combining the sub-frames into a final image:

It could be better. The telescope mount wasn’t fully calibrated, I need a light pollution filter and the stars were twinkling wildly due to disturbed air. Still, it’s a start and far better than I could achieve with the Meade LX90.

Hardware

Most of my time in the observatory has been spent trying to find ways to stop water seeping in and causing mould to form. In the end I had to spray the whole of the outside of the base with a rubber sealant paint and drill some drain holes in the roof roller rails. These seem to have mitigated the problem but I’ll know better when there’s some more heavy rain.

Other than that, the biggest news is the purchase of a new telescope with a “German” equatorial mount. The fact that the new optical tube is larger too is a side issue: the difference in price between the sizes wasn’t that great, so I went for the biggest which would fit the observatory and the budget.

The reason for the change was two-fold. Firstly, the Meade LX90’s mount is rather crude and wanders all over the place, making it useless for even medium-length imaging of deep-sky objects. Secondly, the fork mount, when combined with an equatorial wedge, excludes much of the sky as the mount gets in the way of the camera.

Anyway, in the end, after some quite extensive research, I decided upon the Celestron EdgeHD 1100, an 11″ scope. That, along with the hardware to fit it onto the observatory’s pedestal and a guide scope, cost a pretty penny.

It was only after I mounted it in the observatory that I found the one big problem with the guide scope (other than the weight): it wouldn’t fit in the observatory. (In fact, the main scope only just fits when it’s at its easterly range of travel.) Still, other than that “Doh!” moment, everything’s fine.


The new ‘scope at home.

Well, that brings things up to date really… More to do in the future.

My 30 year personal computing odyssey… So far.

The Journey Begins.

Sinclair ZX81

It was almost precisely 30 years ago today that my journey into the world of computing began. I remember the day that my parents bought the Sinclair ZX81 which was to become my Christmas present: we’d gone to Bedford to buy it in W.H.Smiths and it came in a brown cardboard box with nothing printed on the outside. We’d then all got into the car and, whilst we drove up past St. Neots towards some shop on the Cambridge road, I was able to open the box and start to read the manual. (We didn’t find the shop in the end and I can’t remember what it was supposed to be selling. Instead we turned back at a small roundabout and drove home.)

At the time I thought of computers as literally magical things. I’d seen them on “Tomorrow’s World”, where a year or so before they had been extolling the new technology which now cost less than a thousand pounds (showing the TI 99/4). Other than this I’d only seen computers on “Horizon” or in science fiction, but here, now, was one sitting in a small box on the back seat of my parents’ car beside me. I also marvelled at the ZX81 manual with its painting of a science-fiction-inspired landscape. (Why are computer manuals so much more boring these days?)

As for programming, at this time I’d only overheard conversations from my classmates at my new school who had had some lessons in the science block. They talked of mystical incantations and something to do with “print, print, print.”

Of course, this being a Christmas present, once we got home it was put away in a top cupboard, out of sight. But still, that was the beginning of the journey.

One Small Computer For A Man…

And so, Christmas Day came and I was at last able to get my hands on the ZX81. It was set up on a chrome-steel and glass coffee table and connected to our old “Elizabethan” 12″ black and white portable TV, which we’d used in the caravan on holiday. I already had a “Binatone” cassette recorder, which I remember getting for my birthday in ’77, but at this point it couldn’t be used as I had no tapes with software on them. However, the Christmas of 1981 was spent cross-legged, tapping at the flat plastic membrane keyboard, typing in the examples from the manual.

It wasn’t long, however, before I hit the limit of the 1K memory, so my progress stalled for a while. It wasn’t until my birthday in February that I managed to get the 16K RAM Pack. Wow! How could anyone fill a whole 16K?! Well, I certainly couldn’t.

Anyway, at this point I think I should start compressing the time scales, otherwise this post will become a book. Suffice it to say that the ZX81 was my mainstay computer for a further 15 months and it taught me the basics. It also taught me how to be patient after I spent one and a half days typing in hex code out of “Your Computer” magazine only for a thunderstorm to wipe out my work. A further two days of typing later and a rudimentary “Space Invaders” game was ready to play, which worked for about a minute until it crashed due to a typo somewhere in the pages of code.

The Steady March of Progress.

In the May of 1983 I finally persuaded my Dad to help me buy a replacement computer, a ZX Spectrum 16K. At the time this cost a huge amount, £125. Well, at the time £125 WAS a lot of money, at least for my family. Of course, the timing was awful, as only a couple of weeks later Sinclair dropped the price of the Spectrum so that £125 would get you the 48K model. Later in the year I sold the ZX81 to one of my Dad’s workmates so I could buy a Fox Electronics 48K upgrade, as many of the games I wanted to play by now required the larger memory. (Can you remember when games were all £4.99? Wasn’t it a scandal when they suddenly jumped to about £6 a pop?! :-)) I later bought the ZX81 back from the person I sold it to, at a profit to him, and it’s now in my loft.

The Speccy was the machine upon which I did most of my first real-world work. This was helped by the addition of the Interface 1 and ZX Microdrives in the summer of ’84, along with my first printer, a Brother HR-5 thermal ribbon printer which could output at an amazing 30 characters per second. This combination took me right through to halfway through my degree, and on it I wrote most of my essays using the “Tasword 2” word processor.

During this period I made my first computer purchase mistake. During the latter months of 1984 I had been reading “Your Computer” magazine and getting more and more enthused about the Memotech MTX series machines. They were sleek (for the time) and they even professed to have a ZX Spectrum emulator in the works. Best of all, they had a built-in debugger/assembler/disassembler on board, just like the “professional” RML 380Z I’d seen and used at school. How could it be bad?

So, after saving up my student grant (yes, they were magical things too) by basically not having a social life in the first term at Uni. (this wasn’t a conscious decision), I spent £199 on an MTX500. This was a very bad move. The machine itself was OK, but being basically an MSX machine without the compatibility, and with software expensive and hard to come by, it was a bit of a lemon. The Spectrum still got more use.

And On, Into The Future.

Sinclair QL

In the January of 1986 I managed to convince my Mum that I needed something more capable to do my University work upon and so along came the Sinclair QL.

This was a major leap forward. Not only did it come with a full office suite of programs, including a word processor, spreadsheet and database application, but it also had a procedural BASIC programming language and pre-emptive multitasking. In other words: welcome to the modern world.

Suffice it to say that this machine was invaluable for my University work, not only as a word processor upon which I wrote my degree mapping project report (I won’t go into the story of the power cut in the halls as I was writing the conclusions) but it was also used to write programs to do some of the project work, such as normative mineral analysis and plotting up data for the remote sensing coursework.

It was also the machine which really got me into low-level programming and assembler. QDOS is/was a beautiful and simple operating system to code assembler on, and Motorola M68000 assembler is really quite high-level; the combination made it simple to write programs. The high-water mark, for me, was a full emulation of the University College London BBC Micro terminal emulator, engineered from their documentation: a combination of a DEC VT52 emulator and a Tektronix 4010 graphics terminal emulator with access to the BBC’s *FX commands.

The QL also acted as my development machine for many projects during my MSc in Computer Science, especially those involving assembly coding. In a way, this is THE machine I learnt the most from.

Onwards and upwards.

I’m now going to speed up a gear and skim past my first floppy disk drive in ’87, the second-hand BBC Micro bought to play Elite in the December of the same year, and even the Atari 520STM in the summer of ’88. No, the next “big thing” was the first hard disk drive, in 1989.

It was a revolution! You could store huge amounts! It was fast! It was expensive! Wow!

Actually, other than the first and last of those statements, these claims would seem laughable today. The device was a 28MB drive for the Atari ST and cost a whopping £400. In today’s money you should probably at least double that figure. Today 28MB would seem like a pitifully tiny amount of storage, enough to hold a couple of images taken with a digital camera, but at the time it seemed cavernous. This was helped by the fact that the ST could only use a modified version of the FAT12 file system and the hard disk drivers could only use disk partitions up to 4MB in size!

Oh, and as for the statement “it was fast”, well, all things are relative. There was a disk speed testing program which came with the disk utilities which could measure the speed of your drive. Bear in mind that this drive was a Seagate SCSI device… the maximum read speed was about 600K/s and writes maxed out at about 400K/s! Today I am getting similar speeds from my ADSL connection, and I’m not that close to the exchange.

The Technological Slow Down.

Up until now it seemed that every year brought a new wonder. Indeed, with the arrival of first Minix and then MiNT on the Atari ST and TT030, I was getting closer and closer to having a UNIX box in the house. 

My home computing before the PC era.

Before the attack of the IBM PC clones

Actually, in 1993 I picked up a Sun 3/80 via Alec Muffett, then purchased a Seagate 425MB hard disk for about £500 to get it to run, and then I DID have a UNIX machine at home. Things were looking up!

After the PC revolution

My home computing set-up in 1995, after the arrival of my second PC, a 486DX2-66.

It wasn’t until 1994 that I made my first steps in the “PC” world, picking up the bare bones of a 386SX machine and then sourcing the components to make a working system so that I could try out this new Linux thing and play with Microsoft Windows. Overall I think it cost me another £500 or so to get it running.

Still, it was essentially the end of the “boost phase” of home computing as far as I was concerned. At this point I had effectively, if in primitive form, everything I have here today. I had a network (10Base2), UNIX and Linux machines, a Windows box and Internet connectivity (albeit via dial-up modem). From then on it was merely a case of gradual improvement in speed and usability.

Until….

Enter the Age of the iDevice.

iPhone

Yes, I can say that we have now entered a new phase of the computing story. It’s both a very good and a very bad thing.

Effectively, for me this phase was presaged many years before I got my first Apple device, when I got my Palm Pilot Pro and mobile phone (a Motorola MR30 brick) in ’97. But it wasn’t really a revolution until I got my first smartphone in 2003, a Handspring (later Palm) Treo 600. It only had GPRS connectivity, but it was e-mail on the go! It had limited web browsing. It was amazing at the time. (It also had amazing battery life, but that’s another story.)

But it wasn’t until I got the iPhone 3G that I really discovered how mobile connectivity should be: simple, sleek, quick, and it “just worked”. The iPhone 4 was just as good.

However, the bad thing about all these devices is the way that the iDevice style of simplification is starting to intrude onto desktop (and laptop) machines, locking users out of being able to access and program them. It’s almost as if you’re only buying the privilege to hold and use the devices rather than owning them. This is a potentially slippery slope.

Anyway, I’ve been rambling on for far too long now. So, I’ll conclude this piece and look forward to hopefully another 30 years of the odyssey to come. I think it’s going to be more evolutionary than revolutionary.

[Edit: 7:50pm 12th November, 2011. : Replaced stock image of Sun 3/80 with image of my computer set-up in 1994 and 1995.]

On the fly VMs: Viable security model for downloaded apps?

I’ve been thinking… always quite dangerous I know…

I woke up early this morning and couldn’t get back to sleep, and for some unknown reason I started thinking about downloaded applications and how to prevent trojans getting a hold. Then it came to me: why let the application have real access to the system, especially the filesystem?

I started wondering how feasible it would be to modify the operating system to create on the fly a virtual machine which is a clone of itself within which an untrusted application is run. This VM would not have any real write access to the filesystem but instead would have a copy-on-write shadow copy of the real one. For performance reasons it would have to have pretty transparent access to the graphics sub-system but this shouldn’t be too high a security risk. Once the application had terminated the filesystem write operations could then be vetted and a risk assessment and “reputation” for the application could be determined before actually making the changes to the real data on the disk.
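
As a very rough user-space illustration of just the copy-on-write vetting part (not the VM cloning itself), here’s a minimal sketch in Python. Everything here, from the ShadowFS name down to the approve() callback, is a hypothetical stand-in of my own invention for what would really have to live inside the operating system:

import os
import shutil
import tempfile

class ShadowFS:
    # Redirect an untrusted application's writes into a shadow copy of
    # the real tree, then vet each change before committing it for real.
    # (Paths are assumed to live under real_root.)
    def __init__(self, real_root):
        self.real_root = os.path.abspath(real_root)
        self.shadow_root = tempfile.mkdtemp(prefix="shadowfs-")

    def _shadow_path(self, path):
        rel = os.path.relpath(os.path.abspath(path), self.real_root)
        return os.path.join(self.shadow_root, rel)

    def open_for_write(self, path, mode="w"):
        # Copy-on-write: clone the original file into the shadow tree
        # the first time it is opened for writing.
        sp = self._shadow_path(path)
        os.makedirs(os.path.dirname(sp), exist_ok=True)
        if os.path.exists(path) and not os.path.exists(sp):
            shutil.copy2(path, sp)
        return open(sp, mode)

    def pending_changes(self):
        # Yield (real_path, shadow_path) for every file the app touched.
        for dirpath, _, files in os.walk(self.shadow_root):
            for name in files:
                sp = os.path.join(dirpath, name)
                rel = os.path.relpath(sp, self.shadow_root)
                yield os.path.join(self.real_root, rel), sp

    def commit(self, approve):
        # Apply only those changes the vetting function approves of.
        for real, shadow in self.pending_changes():
            if approve(real, shadow):
                shutil.copy2(shadow, real)
        shutil.rmtree(self.shadow_root)

The untrusted application would see open_for_write() as its normal open(); the approve() callback is where the risk assessment and “reputation” scoring described above would plug in.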

Later on, the application could either be manually unrestricted or, if its “reputation” was above a certain threshold, unrestricted automatically.

Anyway, it was just a thought.

[Edit] More thoughts added as a comment.

Google+: Cooking with the curate’s egg?

About a week ago I managed to get hold of an invitation to Google+, the new, not quite publicly available, in-development, nascent social site Google are toying with. It’s got quite a “buzz” campaign running about it at the moment and all the technorati are flocking to use it. But is it any good? Or, more importantly, could it become good enough to win mainstream users from Facebook?

Well, it does have a lot going for it. For a start, the interface is clean and the management of the social groups is light years ahead of Facebook’s. There are issues with some of the privacy decisions made in the design, such as limited-circulation posts becoming visible to those outside the initial distribution if one of the people within the circle posts a comment with public distribution. However, these are teething problems and the site is still very much under development.

There is currently no API for external applications to be built, such as games. For some people this is a major problem; for others it’s a blessing. It has been stated that a development system is in the works, so I don’t see this as a roadblock in future.

The feel of the site currently has one major downside for a social site: the whole experience seems quite solitary. This isn’t because of a lack of people to “friend” but more that you have no idea if any of your friends are currently on-line. You may not want to interact with them there and then, but it’s nice to know that they’re about.

The other problem I see currently is that Google+ seems to be mostly gluing other Google services together. The image uploading and sharing is done using Picasa, which isn’t ideal for the posting of quick images on the go from a smartphone. The messaging service is a poorly integrated link to Google Chat.

One of the most interesting new facilities, which could actually make people prefer Google+ over other systems, is the “Hangout” audio/video conferencing and chat sub-system. However, this is currently crippled by two problems. The first is related to the fact that you don’t know who’s on-line at the moment, i.e. you can’t just invite those you know are around for a chat; you have to invite blindly. The second is that you have to download and install a plug-in for your browser for it to work.

So, do I think that it could rival Facebook in the end? Hmm… at the moment I’m not sure. There are currently too many things which make it less immediate and interactive when it comes to dealing with your friends. Also, the current reliance on glued-on functionality from other Google services, which don’t quite match a social sharing system, could well be a long-term problem.

So there you have it: at the moment it’s a curate’s egg, good in parts. I don’t want to damn it so early in its development, but I am a little worried that the early reputation may stick. Let’s hope it does come to rival Facebook, as Facebook needs competition, especially as its developers seem to be getting into the Firefox and Gnome developers’ mindset of changing things for change’s sake and seeing themselves as the only arbiters of good design.