I've had SPARK sitting on my drive for a bit and thought I'd give it a try. I updated my working copy from SVN and hooked my Quake 2 engine derivative up to SPARK to see how it works. I thought I'd post here as well to detail my setup process, the problems I faced, and how I think they should be addressed in later versions of SPARK.
First and foremost, I had to compile it. I opened the solution in VS2010, pointed it at my SDL directory, and built. There was one tiny error in one of the macros (a new addition in the latest trunk related to FATAL errors): the compiler complained that an else needed a matching "if". Oddly enough, I could not find the actual problem, but I got it building anyway. I'm just noting this for the developer; I understand it's a non-stable version.
Once I got SPARK compiled, I had to get the binaries and headers working in my engine. I linked the debug .lib and included Spark.H/Spark_GL.h in the renderer's main source file. Built: no errors, no warnings, no conflicts. So far so good!
Now came the tricky part: what to do to test? I ended up copying the code from the Flake demo and forcing it to run on every renderer frame. My first run was problematic: my engine uses vertex buffers, and SPARK's default GL renderer disables the client states without my knowledge or consent. This was easily fixed by re-enabling them after SPARK renders. Second problem: SPARK also disables GL_TEXTURE_2D behind my back, so I re-enabled that after rendering too. IMO, you should always check which states are already enabled/disabled rather than assume a particular state to begin with, and restore them when you're done.
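For anyone hitting the same state-leak problem, one engine-side workaround (until SPARK restores state itself) is to bracket the SPARK render call with GL's attribute stacks so nothing it toggles leaks into the rest of the frame. This is a sketch assuming a fixed-function GL context; `renderParticles` and the commented-out system call are placeholders, not SPARK API:

```cpp
// Sketch: wrap the SPARK render call so it cannot leak GL state into the
// rest of the frame. Assumes a fixed-function (GL 2.x) context.
#include <GL/gl.h>

void renderParticles(/* SPK::System& spkSystem */)
{
    // Save the server-side enable bits (GL_TEXTURE_2D etc.) and the
    // client-side array state (vertex/texcoord/color arrays).
    glPushAttrib(GL_ENABLE_BIT);
    glPushClientAttrib(GL_CLIENT_VERTEX_ARRAY_BIT);

    // spkSystem.render();  // SPARK may disable arrays/texturing in here

    // Restore whatever the engine had enabled before the call.
    glPopClientAttrib();
    glPopAttrib();
}
```

This costs a little driver overhead per frame, but it beats re-enabling individual states by hand and breaking again the next time the library touches something new.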
I got the Flake demo to work in my little test map, and I was happy at this point. However, since it used simple points, there were obvious issues at a distance, so I switched over to the quad renderer, which produced crisp quads and proper visuals no matter where I was standing.
Now, there were two big issues I was facing with this particle engine at this point. The first is probably the most important, so I'll cover it first: texturing. Texturing is a nightmare the way this engine is set up, because it only allows one texture per group. That makes sense if you're using particle tables, as your samples do, and yes, those are indeed more efficient; but most games have a lot of particle images, texture sizes are quite limited, and it would be impossible to use separate textures on particles in the same group. Since there are no real samples in the trunk on how to address this (treating each particle as if it used its own texture rather than sheet coordinates), I believe at least one proper sample should be provided.
In my opinion, an abstracted way to assign a texture on a per-particle basis should be developed, while leaving the default single-sheet implementation intact. My engine, for example, allows materials on textures that can change based on the time the object was spawned (known internally as "material time"); this would be impossible to recreate in SPARK.
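To make the suggestion concrete, here's a rough sketch of what I mean by per-particle texturing on the engine side: each particle carries a texture handle, and the renderer sorts particles into batches so each texture is still bound only once per frame. `Particle`, `TextureId`, and `Batch` are illustrative names of mine, not SPARK types:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Hypothetical per-particle texturing: instead of one texture per group,
// each particle carries its own texture handle, and the renderer groups
// particles into contiguous batches (one bind + one draw call per batch).
using TextureId = unsigned int;

struct Particle {
    float x, y, z;
    TextureId tex;  // chosen per particle, e.g. by material time
};

struct Batch {
    TextureId tex;
    std::size_t first;  // index range into the sorted particle array
    std::size_t count;
};

std::vector<Batch> buildBatches(std::vector<Particle>& particles)
{
    // Sort by texture so particles sharing a texture become contiguous.
    std::sort(particles.begin(), particles.end(),
              [](const Particle& a, const Particle& b) { return a.tex < b.tex; });

    std::vector<Batch> batches;
    for (std::size_t i = 0; i < particles.size(); ) {
        std::size_t j = i;
        while (j < particles.size() && particles[j].tex == particles[i].tex) ++j;
        batches.push_back({particles[i].tex, i, j - i});
        i = j;  // next batch starts where this one ended
    }
    return batches;
}
```

The single-sheet path stays the fast default; this path only kicks in when a group's particles actually use different textures.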
Side note: I could not figure out how to get SPK::Data working to store a simple pointer per particle. How do you implement "swap"? Shouldn't "swap" take two pointers to SPK::Data rather than particle indices?
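My best guess at the intent (this is speculation on my part, not confirmed SPARK behavior): the pool reorders particles internally, e.g. moving the last live particle into a dead one's slot, so any side buffer has to swap its own entries at those same two indices to stay aligned. A minimal stand-in, with `PtrBuffer` being my own hypothetical class rather than the real SPK::Data interface:

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Hypothetical per-particle pointer buffer. swap(index0, index1) takes two
// particle indices because the pool reorders particles (e.g. on death);
// the buffer must mirror that reordering to keep entries aligned.
class PtrBuffer {
public:
    explicit PtrBuffer(std::size_t capacity) : ptrs(capacity, nullptr) {}

    void set(std::size_t index, void* p) { ptrs[index] = p; }
    void* get(std::size_t index) const  { return ptrs[index]; }

    // Keep this buffer in sync with the pool's internal reordering.
    void swap(std::size_t index0, std::size_t index1)
    {
        std::swap(ptrs[index0], ptrs[index1]);
    }

private:
    std::vector<void*> ptrs;
};
```

If that reading is right, swapping two whole SPK::Data objects would be the wrong granularity; it's the per-particle payloads within one buffer that move.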
The second issue is more of a side issue: culling. There seems to be no easy, abstracted way (i.e., without modifying the engine) to do per-particle culling based on the current situation. In my game, I'd like to cull particles that are behind the view using a simple dot product calculation, but I cannot find any way to hook into the system to stop specific particles from being displayed.
Those are my two cents on the current state of SPARK2. I have to say, though, that aside from all of these problems, the API behind SPARK is incredibly clear and commented pretty well. I do suggest that you always include an example for each class the user is expected to inherit from when one is required (such as SPK::Data); otherwise the implementations get lost in "how do I use this" forum posts.