Exploring Playnite

Hey Folks! This is going to be a bit of a niche topic, but I guess in truth pretty much everything that I talk about is a niche topic. Most people do not give a shit about the minutiae of whatever build I happen to be working on in an ARPG that I have spent over 2000 hours playing. Anyways… some years back GOG proposed a dream of having a single interface to interact with all of your games. The problem was that GOG Galaxy was a bit of a mess. It consumed a ton of resources and regularly disconnected from all of the platforms you could connect it to… never quite living up to the promise of letting you view and launch all of your games through a single application. A few months ago I heard about an open-source project called Playnite that is attempting to actually make good on this dream, and yesterday I finally got around to installing it and checking it out.

The base install of Playnite supports Amazon Games, Battle.net, EA App, Epic, GOG, Humble Bundle, Itch.io, Steam, Ubisoft Connect, and Xbox Game Pass. There are also a wide number of community-supported addons that grant access to further game libraries. For example, I am contemplating installing the xCloud integration, which would add to the interface all of the games I have access to through a web browser via Microsoft Game Pass on xCloud. There are also addons for platforms like Nintendo or PlayStation that just give visibility into your total game library without actually allowing you to launch anything. There is apparently fairly robust support for emulators and for launching the titles you have sitting in ROM/ISO form on your system, which is something I will likely explore at a later date.

One of the aspects that I dig so far is the extremely robust support for filtering and grouping your games. Right now I am using the most simplistic grouping of “installed” versus “not installed,” but I could easily group by genre or any number of other elements, and include whatever sorting parameters I might want as well. All of this seems to take only around 100 MB of memory and negligible system resources. I will say, though, that getting everything set up initially required some manual intervention and several minutes of downloading media assets. Especially when it came to Humble Bundle, I had to do a good deal of searching to find the specific game that was being referenced, and also a fair bit of hiding things that were not actually games. It was worth the hour or so that I spent to get things streamlined a bit.

The end result is a nice single dashboard giving you access to all of the games you have installed on your system. You can manually add games, or have it search specific locations and then decide which of the discovered games to add. For Final Fantasy XIV I use the default launcher, so I had to add it manually. The only negative in all of this is that many of the addons require you to have the official platform client installed somewhere on your system in order for the integration to work. I ran into this, for example, with Itch.io, a client that I had never installed before. That said, launching still works fairly seamlessly despite requiring the storefront-specific launchers to be installed on your system.

I think more than anything the thing that I enjoy the most so far is how rapidly I can switch between views of the games I have available. For example, I have all of these games that Twitch Prime has been giving me for years… that more or less were invisible to me. I was never going to install a dedicated client in order to play them, but thanks to Playnite and the Explorer view I can see everything that is being granted to my account by Amazon in a single location. There seems to be some weirdness with Humble Bundle though, because I know for a fact that there are way more games I have access to than are showing up. I did notice a “hide third-party games” option that I probably need to untick in order to get the rest of the Humble stuff to load, given that a lot of it was tied to either Steam or Origin.

I mean it isn’t perfect, but it is a step closer to having a single launcher for all of the games that I have on various platforms. It also makes me realize how many games I have “entitlements” to on multiple platforms, largely due to giveaways from the Epic Games Store, Amazon, etc. For those with a massive backlog who are seized by analysis paralysis… there is a random game button. That way, when you are having one of those “nothing to play” moments, the launcher will choose for you. It is worth a look if you were interested in GOG Galaxy but annoyed by its poor performance.

Fun with Stable Diffusion

Good Morning Friends and Happy Juneteenth! Yesterday was my Birthday in addition to being Father’s Day, and I am super thankful for all the assorted well wishes. This morning is going to be a bit of a departure from my regularly scheduled ARPG nonsense. Over the weekend I spent some time messing around with Stable Diffusion running locally, and I thought I would talk a bit about it. Let’s get some stuff out of the way first. I do not condone Art Generation models as a method of replacing the work of actual artists. You will see a bunch of images adorning this blog, and they are all, for the most part, the paid creation of my good friend Ammo. In fact, as we speak she is working on yet another one of my harebrained ideas and I have no clue what the final bill will even be, but I will pay it happily as I always do. For me, the “AI Art” landscape is more a toy than a tool, and in the past I have enjoyed feeding it nonsense and seeing what it comes up with out of that chaos. For example… I have no clue who the fuck this dude is, but any time I feed it a prompt with the name “Gideon” in it, I end up with this visage.

I’ve known for a while that you could run various generation tools locally off your graphics card, but I always assumed it would be a tedious process. I started down this present rabbit hole when I found out that there was literally a one-click installer that set up everything that you needed for you. There is a distribution of Stable Diffusion called “Easy Diffusion” that offers a quick install for Windows, Mac, and Linux, and within about 15 minutes I was up and running and generating nonsense. I am sure there is probably something inferior about the path that I took to get to this destination, but I honestly don’t really care. I wanted to see if it could be done, and I have gotten sufficient enjoyment from this digital version of Wooly Willy.

The end result is a web server running on your local machine that you can then access from any other machine on your network. By default, this runs on port 9000 and gives you a fairly detailed interface to control the process. You are going to need some prompts and honestly… the best guide that I have found about this process is this one. More specifically, it was helpful in understanding the concept of negative prompts… aka the things that you want to steer the engine away from creating. This is going to be really important if you are working with anything that could be considered the human form, because like a pre-teen boy… it seems to be drawn to replicating boobs in the strangest ways. There are negative prompts that you can supply to the process in order to sort of steer it away from that particular uncanny valley.
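Conceptually, a generation request is just the prompt plus a handful of knobs, with the negative prompt riding alongside. Here is a minimal Python sketch of how those pieces might be bundled together — the function and field names here are my own illustration, not Easy Diffusion’s actual API schema:

```python
def build_request(prompt, negative_prompt="", steps=25, width=512, height=512):
    """Bundle generation parameters into a single request dict.

    The keys are illustrative; a real backend will have its own schema.
    """
    return {
        "prompt": prompt,
        # Things we want to steer the engine AWAY from producing.
        "negative_prompt": negative_prompt,
        "num_inference_steps": steps,
        "width": width,
        "height": height,
    }

request = build_request(
    "a giant monster attacking a modern city",
    negative_prompt="deformed, extra limbs, nudity",
)
print(request["negative_prompt"])  # → deformed, extra limbs, nudity
```

The important mental model is that the negative prompt is not an afterthought — it travels with every request, which is why it works well as a reusable boilerplate string you paste into each generation.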

The other thing you are probably going to want is some more models to play with. You could get really deep into the weeds trying to explain exactly what a model is… but effectively think of it as encoded data that tells Stable Diffusion how it should produce images. The best place for these seems to be CivitAI.com, and on the front page you will find a number of the more popular options. I’ve played with several of these and, after fiddling around a bit, I think I probably like DreamShaper the best because it tends to lean towards more imaginative imagery rather than attempting to replicate reality. Wherever you installed Easy Diffusion, you should find a directory along the lines of “EasyDiffusion\models\stable-diffusion”. When you download a model from CivitAI it will be a “.safetensors” file, and you just need to drag it into this directory in order to use it. Something important to note… each model is roughly 2 gigabytes in size, so they can rapidly fill your drive if you download too many.
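If you want a quick inventory of what those model files are eating on your drive, a few lines of Python can total them up. The directory path below assumes a default Easy Diffusion install location; adjust it for your own system:

```python
import os

def list_models(model_dir):
    """Return (filename, size in GB) for each .safetensors model found."""
    models = []
    for name in sorted(os.listdir(model_dir)):
        if name.endswith(".safetensors"):
            size_gb = os.path.getsize(os.path.join(model_dir, name)) / 1024**3
            models.append((name, round(size_gb, 2)))
    return models

# Assumed default install path; change this to wherever you put Easy Diffusion.
model_dir = r"EasyDiffusion\models\stable-diffusion"
if os.path.isdir(model_dir):
    for name, size_gb in list_models(model_dir):
        print(f"{name}: {size_gb} GB")
```

At roughly 2 GB per model, even a modest collection adds up fast, which is why a quick audit like this is worth running before grabbing yet another checkpoint.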

Let’s walk through a multi-hour deep dive that I did yesterday in trying to get something interesting out of the generator. I wanted a Kaiju attacking a city, but I assumed that maybe the model wouldn’t understand the term Kaiju. Essentially when writing a prompt I find it best to sort of use simple language. So for example this is the prompt that I supplied to get the above image.

A giant monster attacking a modern city with the military fighting back against it

This was maybe a little too on the nose and had a distinct Godzilla feel to it. Since I tend to generate five images at a time, you can often see it going in a bunch of directions. Some of these were more akin to something that you would find in a Doom video game, but I mostly liked the general direction it was going. Essentially, what I usually do is feed the image that was just generated back into the prompt in order to iterate on that idea. I find that over the course of what ends up being a few hundred generations, it slowly narrows down the focus to get closer to what you actually wanted.
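That feedback loop is simple enough to sketch in Python. To be clear, `generate_batch` and `pick_favorite` below are stand-ins of my own invention — the real “generate” step is Stable Diffusion doing img2img, and the real “pick” step is me eyeballing five images — but the loop structure is the whole technique:

```python
import random

def generate_batch(prompt, init_image, count=5):
    """Stand-in for a real img2img call: returns `count` candidates.
    Each 'image' is just a labeled string so the loop is runnable here."""
    return [f"{init_image}+variant{i}" for i in range(count)]

def pick_favorite(candidates):
    """Stand-in for the human step: keeping the one image you like best."""
    return random.choice(candidates)

prompt = "a giant monster attacking a modern city"
image = "seed-image"
for round_number in range(3):  # in practice this went on for hundreds of rounds
    batch = generate_batch(prompt, image)
    image = pick_favorite(batch)  # the keeper seeds the next round
print(image)
```

The point of writing it out this way is that the human is inside the loop: the model never converges on its own, it converges because you keep re-seeding it with your favorite from each batch.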

I wanted to go in a different direction, so I tweaked my prompt a little bit and fed it the previous image.

A giant monster that looks like Cthulhu attacking a modern city with the military fighting back against it

Basically, I supplied that I wanted it to look more like Cthulhu to steer it away from Godzilla… which worked like a charm. However, remember that bit where I said that the models seem to really want to draw boobs? I included this specific image just to show that point.

I took a bit of a pivot because I didn’t want this to end up being a pretty boring drab scene. So for the next prompt, I started adding some style elements to it. I also wanted the city to look a bit more ruinous.

a vaporwave scene featuring A giant monster that looks like Cthulhu attacking a modern city knocking down buildings some of the rubble on fire

I noticed that some of the elements of the monster were now looking mechanical, which made me start shifting in a slightly different direction. Could I get it to give me a scene of a Kaiju fighting a Robot over a ruined landscape? Unfortunately, from this point forward… I don’t have exact prompts. The generated images are named based on your prompt, and my phrase got too long for the filename character limit.
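If you are curious what that truncation problem looks like, here is a rough Python sketch of turning a prompt into a filesystem-safe name with a length cap — the cap and the slug rules here are my own arbitrary choices, not the naming code Easy Diffusion actually uses:

```python
import re

def prompt_to_filename(prompt, max_len=120, ext=".png"):
    """Turn a prompt into a filesystem-safe name, truncated so long
    prompts don't blow past path-length limits (120 is an example cap)."""
    # Collapse anything that isn't a letter or digit into underscores.
    slug = re.sub(r"[^A-Za-z0-9]+", "_", prompt).strip("_")
    return slug[:max_len] + ext

name = prompt_to_filename(
    "a vaporwave scene featuring a giant monster that looks like Cthulhu "
    "attacking a modern city knocking down buildings some of the rubble on fire"
)
print(name)
```

Once the prompt outgrows the cap, two different prompts can truncate to the same leading text, which is exactly why the saved filenames stop being a reliable record of what you typed.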

I kept the prompt above but instead added that it was fighting a Giant Robot that was wielding a laser sword. The generator got confused as it often does… and just started making the monster look more robotic. While extremely cool looking… this was not really what I was going for, but it was a key step in the process nonetheless.

I tried a few more rounds of generation, feeding my favorite from the previous round into the next round… but no matter what I tried it seemed to be hung up on a single “monster” figure. I am including this one mostly just because it looked pretty freaking badass. I have no clue what is going on with that building but I am on board with the bio-organic mech with a giant laser sword thing that is going on here.

It was around this point that I decided that I needed to tweak things a bit further. So I specifically called out that there were two figures, the monster and the giant robot, and this finally began to hit paydirt. It started out a bit ephemeral at first, with this pseudo-robot-building thing in the background. However, that gave me a thread to expand on, again feeding one generation into the next round of five generations.

After a few more rounds of generation, the idea was finally starting to take root. I say idea because, in fiddling with this nonsense, it does seem like the generator gets something stuck in its head and you have to sort of forcibly dislodge it at times. I was getting somewhere, but knew it would take many more instances of taking the image that was closest to what I wanted and feeding it into the next loop… doing this over and over until the results started to turn in the direction I wanted.

Then, after legitimately two or three hundred rounds of this nonsense and an entire afternoon wasted… we have this glorious piece of nonsense. I think what I find so interesting about Stable Diffusion so far is that it can serve as this rapid ideation platform. If you want to quickly iterate on some ideas you had in your head, you can come up with something that is still… very visually wrong at times, but contains the flavor of what you wanted. I could honestly see this being an amazing tool for an “actual artist” to test out some ideas and have the machine keep iterating on something until they get a layout and subject matter that they wanted… which could then serve as the scaffolding to build something interesting. Even for wordsmiths this could be super interesting, because I can already wrap a story in my head around what is going on in this picture.

It is also sometimes just super interesting to feed it a prompt and see where it goes. This is a hundred or so generations off of the prompt “Belghast”. No clue why but it seemed to really latch onto a military and zombies theme when I used my chosen moniker as the prompt.

This delightfully ominous scene was generated with this prompt:

a skinwalker stalking a group of hunters in the woods

So while I would not at all consider that to be a successful prompt as far as subject matter goes… it still looks freaking cool nonetheless. Again this image has a story to it that is just begging to be told. I feel like trying to make one of these image generators create exactly the image that you were wanting… is a path to madness. However, if you sort of go with the flow and iterate on the patterns that you are seeing emerging… it can produce some really interesting things. While I don’t exactly consider this art on the same level as the things being produced for me by Ammo… there is definitely an art form that is emerging from guiding the machine. It somewhat reminds me of carving a woodcut block, and allowing yourself to lean into the imperfections of the material… rather than trying to fight against them.

Anyways I thought this was interesting. No clue if anyone else cares about it, and I have no clue if I will ever walk through one of my generative steps again in the future. The cool thing about this blog is it is a “me” blog more than it is a blog devoted to any one particular topic. I’m enjoying creating nonsense with Stable Diffusion and thought I would share that with you all.

Moonlight Game Streaming

If you have read this blog for any period of time, then you have probably seen me extoll the virtues of Parsec for game streaming. If you have not, then a quick 500-foot view is that my world works a little differently than your average gamer’s. I have one beefy but aging gaming machine upstairs, but I also spend a lot of time downstairs on my laptop. Said laptop is old enough that its GeForce 960M graphics card is long beyond any useful ability to play games. However, I mitigate this fact by streaming games from my upstairs machine to the laptop over my local area network. I’ve been doing this since 2018 with pretty solid success, after having tried a few other options that never quite panned out. There are, however, a bunch of things that you just sort of take for granted while using Parsec.

First off, the connection is going to have intermittent lag, causing the audio and controls to have what I can only describe as a “hiccup,” where the audio drags for a moment and the controls go a little wonky. If this happens at the wrong time it can mean a death, so I find I kinda play accordingly so as not to press my luck. The other problem is that the video can artifact something fierce if there is a large amount of movement going on on the screen. For example, if a game has rain… prepare for a pixelated mess until you get indoors. I found this particularly bad in Minecraft for some reason, making it extremely hard to play if there was rain or snow happening on screen. These are all things that I have just sort of dealt with, because they were the price of entry for being able to play games on my laptop remotely.

The challenge, however, is that over the last year these issues have seemingly gotten worse. I know that with the pandemic, Parsec has been selling its services heavily, not just as a game streaming platform but as a super duper terminal services client. No matter how much I tell the client to directly connect to machines on my network, there is still a remote connection overhead of the client dialing home in order to locate the box it is attempting to connect to. This overhead seems to have gotten worse, causing the audio/visual hiccups to come more often. I’ve done everything that I can think of to try and remove issues from the chain… but the end result is that I am less likely to play “serious” games from my laptop, leading me to spend way more time at my desktop… which is also doubling as my work computer in this era of remote work.

I am not sure exactly what led me to stumble onto Moonlight last night, but stumble onto it I did. I remember hearing about this project at one point in the past but never actually getting around to testing it out. Effectively, Moonlight is an open-source client that takes advantage of the Nvidia Gamestream tools built into GeForce Experience and the Shield infrastructure to allow you to stream games to lots of different platforms. Officially Moonlight has the following clients:

  • Windows
  • MacOS
  • Linux
  • Steam Link
  • Raspberry Pi and other SBCs
  • iOS and Apple TV
  • Android
  • Amazon FireOS
  • Google ChromeOS
  • PlayStation Vita (homebrew)

Essentially it supports all of the platforms that Parsec does, with a few more thrown into the mix. For a while I had been contemplating trying to build a set-top box that would let me stream Parsec to a television, but I never got around to it. Moonlight, however, apparently just straight up natively supports the Android TV box that I already own. I will have to hook up a controller to it and test this out more closely to determine how well it works.

One of the challenges with Moonlight is that it is quite a bit more fiddly than Parsec. Parsec involves setting up an account, installing the client on two machines, and adding one as a host, and then you are up and running pretty quickly. Moonlight requires you to have GeForce Experience installed on the host machine, then going into the client, under the Shield section, and toggling on Gamestream (which requires a GTX 650 or newer graphics card). If you are an AMD graphics card user, it requires more fiddling and apparently the OpenStream platform installed on your host machine. After you have Gamestream turned on, your machine should be findable as a host in any Moonlight client installed on your network. There is a handshake, reminiscent of Bluetooth pairing, that requires you to have access to both machines. When you attempt to connect for the first time with Moonlight on a new platform, it will show a short code, and that code will have to be entered on your host machine in order to verify access.

From there you will be presented with a list of the games that GeForce Experience thinks you have installed on your machine. You might have to manually add games if they don’t show up, or just do what I did and configure a Windows app to open… which effectively allows me desktop access to the machine. I chose MSTSC.exe because it seemed fitting, given that it is the Remote Desktop client, but you could just as easily configure it to open Notepad.exe, because the end result is it giving you access to the desktop. From there it works just like a normal remote desktop session, and you can launch any games you might have on your host system. The individual game shortcuts seem to work pretty well, as it will connect you and then automagically launch that specific game.

net stop NvContainerLocalSystem && net start NvContainerLocalSystem

I did end up needing to create a batch file with the above command in order to sort of “reset” the system if anything goes south on the host machine. This is essentially the equivalent of going into GeForce Experience and toggling Gamestream off and on. The default key combination for disconnecting from a Moonlight session is ctrl+alt+shift+q. However, this morning while trying to take screenshots of how the Moonlight process worked, I stranded a session, forcing me to run the batch file to disconnect and restore things back to normal. I did notice that one of the pieces that did not get restored was my audio settings, so I had to go in and manually flip things back to speakers. Again, Moonlight is way more fiddly than Parsec, which more or less just works.

Another thing that I encountered last night is that when I first attempted to connect in remote desktop mode, I ended up with a 4K window with a tiny 1080p window up in the corner. After some googling and messing about, I found that I needed to go into the Nvidia Control Panel and change the Desktop Scaling settings. Since I run in native 4K mode while at the machine, it doesn’t really do anything locally, but while remotely connected it takes the 1080p version of the desktop and blows it up full screen, granting me easier access to it.

So at this point you are asking yourself… Bel, why the hell would you go through this much trouble when you yourself have admitted that Parsec just works easier? Because running Moonlight was the best version of remote game streaming that I have ever experienced. I have long said that Parsec, when it is working well, is like sitting at the machine and controlling the games… but that is a lie. Even when it is working at its best, there are always some telltale signs in games that I am connected remotely and streaming. Last night, while playing through Moonlight, it legitimately did feel like I was upstairs playing at the keyboard when instead I was down on my laptop. I played a bunch of different games last night, but at some point during the evening I started playing Generation Zero. The above screenshot was taken from the laptop of the game client running over Moonlight, and there is no artifacting going on in the rain.

I played quite a bit of Outriders as well, and it was so smooth and responsive. I think I had just gotten used to the subtle lag that Parsec added to the gameplay experience, and don’t get me wrong… Parsec was better than anything I had tried up to that point. Moonlight was just a whole other level of smoothness, and I think I could probably even do competitive modes in Destiny 2 through this connection. I remember it lagging three times during the entire night, and even then it was only for a second before immediately returning control. I am not sure if Nvidia Gamestream has something built in to handle this, but it felt like the game just paused for a second before giving me access again, rather than continuing to run in the background and leaving me to deal with overcorrection as my character kept moving along whatever path they were on before the lag.

As much as I have loved Parsec these last three years, I think I might have a new main squeeze. As I said before, these screenshots were grabbed via Fraps, which just happened to be running on the laptop since I used to use it for game capture. Parsec does this thing where it intercepts a number of buttons and keeps them from being passed through on the client machine, which is what stops voice chat from working, and it also stopped me from capturing screenshots of what the Parsec client looked like performance-wise. So unfortunately I don’t have any good examples of it artifacting out on me, but I am hopeful that maybe, just maybe, I can get voice chat working once again while on my laptop downstairs. This has been a huge source of disconnection for me, because so often when I am just wanting to chill out I am on the laptop, which prevents me from using it.

I will of course keep sharing my thoughts as I get used to Moonlight. I want to try a number of the other platforms like streaming games from my phone with the client. I will obviously report my findings in later posts.