Posted by admin on Jun 1, 2010 in Uncategorized
Recently, a discussion came up on a forum about using a game to raise awareness about different technologies related to nuclear power.
Renewable energy is definitely here to stay, and can solve a lot in the short and medium term. For some uses, however, it's just not enough, and at some point fossil fuels will run out or become too expensive, not even considering the damage they're doing to the environment: some form of nuclear power will have to come, sooner or later. The traditional approach, i.e. fission power plants of various kinds, has shown all its limits, leading most people to the equation nuclear = bad; meanwhile, fusion power isn't here yet, as scientists have struggled for decades to build a stable device that can produce net power from cheap fuel.
However, unless you’re a physicist involved in one of the projects (or you’re lucky enough to attend the Google Tech Talks), you’re unlikely to have ever heard more than a few references to fusion experimental devices: the only one that makes it to the public is the old Tokamak device (and its newer sibling ITER), and possibly some laser devices.
There are, however, other promising approaches being experimented with: bringing them into a game would let the general public learn more about them and perhaps appreciate their pros and cons.
After a few posts, the discussion is now turning into a proposal for a simple simulation game, where the player chooses the kind of reactor they want to develop and drives it to maximum efficiency.
Now, a good part of a game's success depends on its name: that's why today you'll find a poll on the sidebar listing all the suggestions that came up in the thread. Please vote for the one(s) you like, and if you have something to say about it, be my guest!
Posted by admin on May 22, 2010 in Generic Cocoa
So, this is my new place!
I’m still organizing things: all previous posts in English have been imported properly, and I’ll add some more content over the next weeks.
Please bear with me, I hope you’ll not be disappointed!
Posted by admin on Feb 28, 2010 in Mac OS X, Quartz Composer
In February I worked now and then on my Lua plugin for Quartz Composer.
- Courtesy of Noodlesoft, the editor is nicer, and shows line numbers and a marker when an error is detected.
- There is now experimental support for the image type: still no way to interpret images (and, anyway, why would you?), but they're properly recognized and passed around
- To implement the images, I had to use undocumented APIs (the QCImage class isn't public), as there was no (simple) way to get what I needed through the official ones…
Posted by admin on Jan 29, 2010 in Mac OS X, Quartz Composer
I’ve finally found the time to review the pending issues left in my Lua plugin for Quartz Composer. The things that were left out are now working, plus I've added a bit of debugging help at the source code level. Still, the goal of a working JIT version (which would speed up scripts) is held back by the fact that there is no 64-bit LuaJIT yet…
Posted by admin on Oct 15, 2009 in Mac OS X
I’ve modified the command line tool to accept a few more params, so that a video input can be used as the frame source. Thanks to the QTCapture framework the code is pretty straightforward, and it's interesting to note how little was needed to get it working. Clearly, although with little documentation, IOSurface integrates perfectly with the existing technologies!
Posted by admin on Oct 10, 2009 in Mac OS X
To extend the previous sample, I've now added a Quartz Composer plugin that spawns the CLI application: it's also possible to choose at compile time (through a #define) whether the image is provided to QC as a GL texture or a pixel buffer.
A sample composition has been included in the code.
- The embedded CLI application is set as a dependency of the other 2 targets in the project and should be compiled automatically; in certain cases, however, Xcode appears to "forget" the build flags and tries to compile it in 64 bits (which fails, as QuickTime doesn't exist in that universe). To solve this issue, compile the IOSurfaceCLI target separately.
- A bug seems to affect Quartz Composer whenever a movie is started and stopped multiple times: the projection matrix, for some reason, isn't reset, and the frame appears much bigger than it should be. A bug report has been filed with Apple.
Posted by admin on Sep 25, 2009 in Mac OS X
Snow Leopard may not have looked much different from its predecessor from the average user's point of view; for developers like myself, however, a lot of things have changed, some well advertised (say, GCD and 64 bits), some still to be discovered. A good overview can be found at Ars Technica.
As one of the long-overdue changes, not to mention all the limitations derived from its venerable age, Apple has taken a first step toward the future of the old QuickTime (which will stay in the 32-bit universe) with the new QuickTime X. QuickTime isn't easy to replace in one shot, though, and it's still present in the system, transparently invoked by QuickTime X (or, for us developers, by QTKit) whenever it's needed.
But how can 64-bit software (like the QuickTime X Player, or the Finder itself) use a 32-bit library? The answer is, it doesn't; the technique used behind the scenes is far more interesting: when a 64-bit application needs a frame from a movie it can't process otherwise, a helper process is launched (you'll see it in Activity Monitor as QTKitServer-(process-ID) process-name) that hands the frames back to the 64-bit app.
Hey, isn’t that nice? Graphics passed from one process to another, how can they do that? The answer looks like it’s in a new framework, IOSurface.
Disclaimer: the following statements are the result of personal experimentation: as such, they don’t represent any official documentation nor endorsement.
IOSurface is included among the new public frameworks, but no mention of it exists in the official documentation: looking at the various C headers, however (not only in IOSurface.framework, but throughout the graphics libraries – Spotlight's your friend), it's possible to get a glimpse of its capabilities.
Putting together some sample code
A good example of how IOSurface works could be a quick and dirty implementation of a QTKitServer lookalike: a 32-bit faceless application that plays any QuickTime movie, and its 64-bit companion that shows the frames in an OpenGL view. More in detail, an IOSurface can be attached to many kinds of graphics surfaces and passed around between different tasks, which makes it the perfect candidate for our own QTKitServer clone. The link to the Xcode project is below – 10.6-only, of course.
The faceless movie player
Now, let's see how to create frames on IOSurfaces. For a start, we can create Core Video pixel buffers (one way to define an offscreen destination for QT movies – see the excellent QTCoreVideo sample projects) with IOSurfaces bound to them: when we create the QTPixelBufferContext, adding the proper keys to the optional attributes dictionary instructs Core Video to attach an IOSurface to each pixel buffer we get back. Each CVPixelBuffer we receive from Core Video can then be asked for its IOSurfaceRef: IOSurfaceRefs are the references to use inside the same application, and each surface also has a unique IOSurfaceID that other processes can use to obtain a local IOSurfaceRef.
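As a sketch, setting up such a context might look like the following. This is my own reconstruction from the 10.6 public headers (there's no official IOSurface documentation), with error handling omitted:

```c
#include <QuickTime/QuickTime.h>    // QTPixelBufferContextCreate, kQTVisualContextPixelBufferAttributesKey
#include <CoreVideo/CoreVideo.h>    // kCVPixelBufferIOSurfacePropertiesKey
#include <IOSurface/IOSurface.h>    // kIOSurfaceIsGlobal

static QTVisualContextRef CreateIOSurfacePixelBufferContext(void)
{
    // A dictionary under kCVPixelBufferIOSurfacePropertiesKey asks Core Video
    // to back every pixel buffer with an IOSurface; kIOSurfaceIsGlobal makes
    // each surface reachable from other processes through its IOSurfaceID.
    CFMutableDictionaryRef surfaceProps = CFDictionaryCreateMutable(
        kCFAllocatorDefault, 0,
        &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(surfaceProps, kIOSurfaceIsGlobal, kCFBooleanTrue);

    CFMutableDictionaryRef pixelBufferAttribs = CFDictionaryCreateMutable(
        kCFAllocatorDefault, 0,
        &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(pixelBufferAttribs,
                         kCVPixelBufferIOSurfacePropertiesKey, surfaceProps);

    CFMutableDictionaryRef contextAttribs = CFDictionaryCreateMutable(
        kCFAllocatorDefault, 0,
        &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(contextAttribs,
                         kQTVisualContextPixelBufferAttributesKey, pixelBufferAttribs);

    QTVisualContextRef context = NULL;
    QTPixelBufferContextCreate(kCFAllocatorDefault, contextAttribs, &context);

    CFRelease(surfaceProps);
    CFRelease(pixelBufferAttribs);
    CFRelease(contextAttribs);
    return context;   // attach to the movie with SetMovieVisualContext()
}
```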
For the sample I've put together, I've used the simplest way of passing IOSurfaces, i.e. asking for them to be created global and passing the IDs around: not the ideal solution in the long term, but the alternative (passing them through Mach ports) looks more complex and error-prone to implement without the docs.
The small CLI app takes the movie to play as its only argument, and passes the surface IDs back through a simple pipe. Using the kIOSurfaceIsGlobal option also puts a constraint on the consumer side: since the CLI doesn't know anything about the consumer, surfaces are reused as soon as possible, so they have to be consumed at once. Binding them to Mach ports, on the other hand, would force the framework to create new surfaces until the previous ports are deallocated.
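On the producer side, extracting the global ID from each pixel buffer and writing it to the pipe comes down to a couple of calls (again a sketch from the 10.6 headers, not official documentation):

```c
#include <stdio.h>
#include <CoreVideo/CoreVideo.h>    // CVPixelBufferGetIOSurface
#include <IOSurface/IOSurface.h>    // IOSurfaceGetID

// Called for each frame pulled from the QTVisualContext.
static void PublishFrame(CVPixelBufferRef pixelBuffer)
{
    // Non-NULL only if the buffer was created with
    // kCVPixelBufferIOSurfacePropertiesKey in its attributes.
    IOSurfaceRef surface = CVPixelBufferGetIOSurface(pixelBuffer);
    if (surface != NULL) {
        // The ID is valid system-wide because the surface is global.
        IOSurfaceID surfaceID = IOSurfaceGetID(surface);
        printf("%u\n", (unsigned)surfaceID);
        fflush(stdout);   // the consumer reads us through a pipe
    }
}
```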
The 64-bit GUI application
Our 64-bit app is a very simple GUI: nothing really special here; a movie is chosen and passed to our faceless app, launched in the background as an NSTask, whose output is captured and parsed for IOSurfaceIDs. The interesting part is the few lines that take an IOSurfaceID and build a texture we can use: the new call is CGLTexImageIOSurface2D, which is meant to be the IOSurface equivalent of the glTexImage2D used in regular OpenGL to upload images.
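Turning a parsed ID into a usable texture might look like this sketch (same caveats as above; the CGL context is assumed to exist already, and I've picked a BGRA rectangle texture to match the typical pixel buffer format):

```c
#include <IOSurface/IOSurface.h>    // IOSurfaceLookup, IOSurfaceGetWidth/Height
#include <OpenGL/OpenGL.h>          // CGLTexImageIOSurface2D
#include <OpenGL/gl.h>

// Look up the surface published by the CLI process and wrap it in a texture.
static GLuint TextureFromSurfaceID(CGLContextObj cglContext, IOSurfaceID surfaceID)
{
    // Get a local IOSurfaceRef from the global ID.
    IOSurfaceRef surface = IOSurfaceLookup(surfaceID);
    if (surface == NULL)
        return 0;

    GLuint texture = 0;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, texture);

    // The IOSurface counterpart of glTexImage2D: no pixel copy,
    // the surface memory itself backs the texture.
    CGLError err = CGLTexImageIOSurface2D(cglContext,
                                          GL_TEXTURE_RECTANGLE_ARB, GL_RGBA,
                                          (GLsizei)IOSurfaceGetWidth(surface),
                                          (GLsizei)IOSurfaceGetHeight(surface),
                                          GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV,
                                          surface, 0 /* plane */);
    CFRelease(surface);
    return (err == kCGLNoError) ? texture : 0;
}
```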
The code is only good as a demo of the capabilities and for experimenting; in many respects a real-world solution would use very different techniques!