Posted By: u-man How does the hardware of vector-games work? - 09/29/15 11:39 AM
Hello guys,

I already asked questions in the shoutbox about the hardware of vector games and how it works. Many things were said, many were interesting, but I still have no definitive (or source-backed) answer. Specifically, the "overbrightness" in some games is relevant for me (e.g. the bullets in Asteroids, the Death Star explosion in Star Wars, etc.). Initially I asked my German arcade-zone.de friends (pro collectors and the best scene in Germany) what they know about the hardware and how this "overbrightness" really works. They came up with a so-called "Z amplitude" as an answer:



The question is: if this is true, is it emulated in MAME, and if so, where would I find it?

Asking the same question here, I got different answers. People explained that vectors are drawn multiple times to achieve the "overbrightness", which nobody at arcade-zone.de can confirm for me. So I am stuck without the right answer. Is there a source anywhere that describes this multiple drawing of vectors?

I am assuming that this "overbrightness" is not emulated in MAME, or at least not noticeably enough. I am asking all these questions because Jezze could maybe change some things, but he wants to do it properly and, hopefully, just with HLSL.

I came up with the idea of a clipping filter to keep the brightness of vector games within a range and leave some headroom for the "overbrightness", but we need to know if it is even possible to retrieve the "overbrightness" info inside MAME. So this thread is more of a preparation for upcoming work to achieve the desired result. Any expert answers about the vector renderer in MAME are welcome here.

Jezze already did some changes to vector games, bringing back some HLSL functions for vector games. You can read about it here: https://github.com/mamedev/mame/commit/062e6e0383050546656dfed7261273a2d7d142a4
I am not sure if he added HLSL shadowmask options for colored vector games (yes, they have one), but if not, it is surely on the "upcoming next" agenda wink .

These pictures are taken from an original Star Wars, where you can clearly see the shadowmask:





I know that we can't emulate vector games to look like the originals (given the current state of MAME, which treats the screen of a vector game like a raster screen), but it would be nice to "simulate" it as well as possible, without altering any code that functions properly and behaves like the original. So the question is how far we can get with alternatives like shaders and so on. I am open to any nice ideas and suggestions here. Jezze can't rewrite the vector renderer, but we hope there is stuff that could be useful for shaders.

I think that's enough for a good initial starting thread.
Thanks for your time reading it wink
The "Z amplitude" is the brightness of the video signal, as we attempted to explain to you previously.
Correct. The "Z Amplitude" signal is just a clever way of saying the brightness of the beam. In actual fact, I think this may be a combination of four things:
- The vector system assuming everything being drawn is a line
- The vector system's internal anti-aliasing working against us
- The vector system not handling intensity in an ideal way
- The handling of vectors in d3dhlsl.c and the vector shader need to be fixed, ideally by Jezze

Here's a pastie link containing a dump of each point that's written by avgdvg.c in one frame of Asteroids, with a player's shot on-screen: http://pastie.org/10450392

The first three points are the player's shot. Note the intensity: it's roughly double the intensity of all of the other vectors! So this raises the question: why is the shot still so dark? Well, it's a combination of things.

First of all, the vector system assumes everything being drawn is a line. Check it out for yourself at the bottom of src/emu/video/vector.c. With what's currently in the list, it will produce a line of length 0 for the player's shot.

This certainly isn't going to be fatal, but it's not ideal, and it's even less ideal when you consider the fact that vectors, by default, are anti-aliased. How is our OSD-side anti-aliasing code going to account for the fact that it's a line of zero length? I would guess that it's probably not going to behave too terribly well.
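To illustrate the degenerate case being described, here is a minimal sketch (the struct and function names are invented for illustration, not taken from vector.c) of how a renderer could detect that a "line" is really a point before handing it to anti-aliasing code:

```cpp
#include <cassert>

// Hypothetical vector primitive, loosely modeled on the per-line data a
// vector renderer keeps: endpoint coordinates plus an intensity value.
struct vec_line { float x0, y0, x1, y1; int intensity; };

// A zero-length "line" is really a point. Detecting it up front would let
// the renderer draw a small filled dot instead of passing degenerate
// geometry to the line anti-aliasing path, where it tends to vanish.
bool is_point(const vec_line &v)
{
    return v.x0 == v.x1 && v.y0 == v.y1;
}
```

With data like the Asteroids shot dump above, such a check would flag the player's shot (identical start and end coordinates) as a point rather than a line of length 0.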

I would also suggest that the way vectors are currently handled in my HLSL code leaves something to be desired. There's a gross hack in vector.fx to try to brighten short vectors, but that's a hack and should eventually be removed.

Lastly, regarding things that could be easy to fix: The core vector system in vector.c treats the vector's intensity as an alpha value, but does not attenuate the actual color value of the vectors. I bodged a quick hack into vector.c to scale down the color value based on the intensity value, as well as use the intensity value as alpha, and this is the result on Asteroids and Star Wars. Note that you have to click on the images in order to expand them, otherwise the size reduction makes it harder to see the differences: http://imgur.com/a/spPch
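As a rough sketch of the attenuation idea described above (the function and struct names are hypothetical, and the linear scaling rule is an assumption rather than what the actual hack did): instead of using intensity only as alpha, the RGB channels are scaled by it as well, so half-intensity vectors come out visibly darker than full-intensity ones.

```cpp
#include <cassert>
#include <cstdint>

struct rgba { uint8_t r, g, b, a; };

// Scale the color channels by the 0..255 intensity value in addition to
// using the intensity as alpha, increasing contrast between half- and
// full-intensity vectors.
rgba apply_intensity(uint8_t r, uint8_t g, uint8_t b, uint8_t intensity)
{
    return rgba{
        uint8_t(r * intensity / 255),
        uint8_t(g * intensity / 255),
        uint8_t(b * intensity / 255),
        intensity
    };
}
```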

It does make most of the vectors rather dark, yes, but it does dramatically increase the contrast level between the half-intensity and full-intensity vectors, and I think some post-processing on top of that would really go a long way, assuming Jezze can whip together a better vector shader than what's already there (which is admittedly awful).

As for actual accuracy goes, the current theory is that the shots are made extra-bright by holding the beam in place for the shot for an extra few fractions of a second versus how vectors are normally drawn by the Atari DVG unit. This could be fixed by having vector.c know about the current time of the start and end points of the vector, and use that to infer an additional level of brightness in addition to what's specified by the hardware (which is already emulated).
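A toy version of that dwell-time idea might look like the following (the function name and the proportional-scaling rule are assumptions for illustration, not anything that exists in vector.c): if the beam dwelt on a vector longer than the nominal per-vector period, boost its intensity proportionally, clamped to the maximum.

```cpp
#include <algorithm>
#include <cassert>

// Infer extra brightness from how long the beam spent on a vector.
// base_intensity: hardware-specified intensity (0..255)
// start_time/end_time: seconds at which the beam entered/left the vector
// nominal_period: expected drawing time for an ordinary vector, in seconds
int dwell_boosted_intensity(int base_intensity, double start_time,
                            double end_time, double nominal_period)
{
    double dwell = end_time - start_time;
    double factor = std::max(1.0, dwell / nominal_period);   // never dim
    return std::min(255, int(base_intensity * factor));      // clamp
}
```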

The problem is that having this functionality is meaningless unless the individual vector drivers themselves start to use it, and that's a much more daunting task, because the current emulation of the Atari DVG is pretty ancient, and doesn't really play well with MAME's current scheduler system. More specifically, the Atari DVG code is a god damn cluster fuck of epic proportions: It appears to have its own internal scheduler, so to speak, and operates using its own time slicing, then reports back to the parent driver how many cycles it ran for, or something like that. As a result, the current system time as reported by machine().scheduler().time() is in fact the same for swaths of the vectors generated, and it seems like it would take a pretty massive re-tooling of the code in order to operate correctly. Multiply this by all of the different drivers that make use of the vector system, and, to quote Dog from Lock, Stock, and Two Smoking Barrels, "It's a bit more than a bit of a problem. I'd say it's the Mount fucking Everest of problems".
Vectrex is one of the reasons I started to contribute to MAME/MESS, as I was hacking away at a C++ tutorial for the Vectrex and needed a nice debug environment, and I still aim for that using MAME/MESS. Anyway, while doing the Vectrex tutorial revamp, converting 6809 assembler to C/C++, I read a lot of documents about the internals of the Vectrex; there are loads of them at the Vectrexmuseum and in other places you'll find via the link collection there.

From that I believe that a good emulation needs to render time as well as the vector, and that time contributes brightness not only to the current pixel itself but to surrounding pixels, which accumulate brightness as the beam passes by. Then the pixels would have their brightness fade away as the phosphor loses its charge.
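The accumulate-then-fade behavior described there can be sketched with a toy model (the struct name, the per-frame update, and the decay constant are all assumptions for illustration, not measured phosphor data): each beam pass adds energy to a pixel, and every frame the stored energy decays exponentially.

```cpp
#include <cassert>

// Toy phosphor model: brightness accumulates while the beam excites the
// pixel and fades exponentially afterwards.
struct phosphor_pixel
{
    double energy = 0.0;

    // called whenever the beam passes over (or near) this pixel
    void excite(double amount) { energy += amount; }

    // called once per frame; decay in (0, 1) controls how fast glow fades
    void fade(double decay) { energy *= decay; }
};
```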

The Vectrex hardware is really simple: just a D/A converter per axis and a beam on/off controlled by standard circuits. The speed/intensity of the beam is controlled by a counter/timer. There is also an intensity pot that, when turned to max, makes the beam-off movement traces visible as well.

On top of that, the audio circuitry is really poorly designed and picks up a buzzing noise from the vector circuitry, which changes frequency depending on the vectors. I have seen at least one homebrew game using the buzz during vertical blanking to make music, but I am not sure any original titles did that.

Having said that, I am not sure how far a good enough emulation needs to go in terms of accuracy. I just want to give some input in case the vector system is to be redesigned, and I would be happy to contribute in this area if needed.
Yes, ideally the vector generators (and mathboxes) should be implemented as executable devices so they can be properly scheduled.

However, you can probably make the existing AVG/DVG stuff work by taking the start time and adding cycles_accumulated_so_far * clock_period to the base time and using that for timing.
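That timing suggestion reduces to a one-line computation (here with attotime arithmetic simplified to plain doubles; the real MAME attotime type is a struct, not a double, so this is only a sketch of the idea):

```cpp
#include <cassert>

// Derive a per-vector timestamp from the time at which the vector
// generator started running plus the cycles it has consumed so far.
double vector_timestamp(double start_time, long cycles_accumulated,
                        double clock_period)
{
    return start_time + cycles_accumulated * clock_period;
}
```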
First, thanks for all the info. So I will try to summarize this:

The Z amplitude just handles brightness, and there is no "overbrightness" factor. The overbrightness just comes from the fact that the beam stays a tick longer in the same place.

Further, we don't see bright bullets even though there is a nearly double intensity value for brightness, because the renderer treats points as lines, and those points are also anti-aliased, which in turn makes the bullets look dark again.

Regarding the things that could be easy to fix: is your example in the current MAME version, or is it something that you have done yourself, just to show what is possible with a fix?

It would be a huge task to improve the vector renderer, because we would also need to alter the ROMs to use the improvements, as every vector game treats drawing differently or in some cases even has its own "timing/frame" system.

Is this summary roughly correct?
Originally Posted By u-man
Regarding the things that could be easy to fix: is your example in the current MAME version, or is it something that you have done yourself, just to show what is possible with a fix?


It's actual code that I had running, but it's a somewhat gross hack, so I'm not comfortable committing it to baseline.

Originally Posted By u-man
It would be a huge task to improve the vector renderer, because we would also need to alter the ROMs to use the improvements, as every vector game treats drawing differently or in some cases even has its own "timing/frame" system.

Is this summary roughly correct?


No, that's not correct at all. None of this requires altering the ROMs at all. It all comes down to how the Atari DVG and other games' vector generators are emulated, it has nothing to do at all with modifying the ROMs themselves. Someone proposing a solution that involved modifying the ROMs would get laughed off the mailing list.
Originally Posted By AaronGiles
Yes, ideally the vector generators (and mathboxes) should be implemented as executable devices so they can be properly scheduled.

However, you can probably make the existing AVG/DVG stuff work by taking the start time and adding cycles_accumulated_so_far * clock_period to the base time and using that for timing.


Looking at the code, I can't imagine it would be too terribly hard to do that in its current incarnation, it apparently is based off of vg_run_timer, which would simply go away in favor of an executable device interface which pumps run_state_machine, yeah?
Originally Posted By Just Desserts
Originally Posted By u-man
Regarding the things that could be easy to fix: is your example in the current MAME version, or is it something that you have done yourself, just to show what is possible with a fix?


It's actual code that I had running, but it's a somewhat gross hack, so I'm not comfortable committing it to baseline.

Ok, good to know. It looks nice, and the difference is clearly visible to me. Maybe Jezze can do something with this or use it as a start. I can achieve a similar look with HLSL, but it is a tremendous trial to find the right settings, and they are different for every game; in other words, I can't use global settings that look the same in every game.

Originally Posted By u-man
It would be a huge task to improve the vector renderer, because we would also need to alter the ROMs to use the improvements, as every vector game treats drawing differently or in some cases even has its own "timing/frame" system.

Is this summary roughly correct?


No, that's not correct at all. None of this requires altering the ROMs at all. It all comes down to how the Atari DVG and other games' vector generators are emulated, it has nothing to do at all with modifying the ROMs themselves. Someone proposing a solution that involved modifying the ROMs would get laughed off the mailing list.

Yes, you are right, and I didn't mean the ROMs, but rather the drivers... it is clearly not my day today frown , but that is another story.
All in all, you clarified a lot for me.
Originally Posted By u-man

Yes, you are right, and I didn't mean the ROMs, but rather the drivers... it is clearly not my day today frown , but that is another story.
All in all, you clarified a lot for me.


That's correct in that case. Yeah, the drivers would all need to be updated to make use of the time-based functionality. It would, however, be possible to have an intermediate solution where there's a function that allows you to queue a time-stamped vector, and a function that preserves the old behavior so that drivers can be gradually updated to support the new functionality.
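That intermediate solution can be sketched roughly as follows (all class, struct, and function names here are invented for illustration; they are not MAME APIs): a new time-stamped entry point, plus a legacy wrapper that stamps vectors with a sentinel so unconverted drivers keep working unchanged.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

struct timed_vector { float x, y; int intensity; double timestamp; };

class vector_queue
{
public:
    // New-style entry point: the driver supplies the exact time at which
    // the beam reached (x, y), enabling dwell-time-based brightness.
    void add_point_timed(float x, float y, int intensity, double timestamp)
    {
        m_list.push_back(timed_vector{x, y, intensity, timestamp});
    }

    // Old-style entry point: no timing information available. A sentinel
    // timestamp tells the renderer to fall back to intensity-only drawing,
    // so drivers can be converted gradually.
    void add_point(float x, float y, int intensity)
    {
        add_point_timed(x, y, intensity, -1.0);
    }

    std::size_t size() const { return m_list.size(); }
    const timed_vector &at(std::size_t i) const { return m_list[i]; }

private:
    std::vector<timed_vector> m_list;
};
```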

Still, it would be kind of nightmarish to rework all of the vector drivers to support it. It's something that we should really do at some point, I'm just saying that it won't be a particularly easy fix.
Originally Posted By Just Desserts
Looking at the code, I can't imagine it would be too terribly hard to do that in its current incarnation, it apparently is based off of vg_run_timer, which would simply go away in favor of an executable device interface which pumps run_state_machine, yeah?

It may not be too hard; I haven't looked at the code in ages. I was just proposing an alternative since you had looked and sounded disgusted. smile
FWIW, there's really only a very small number of drivers that use vectors, so upgrading them all to something better shouldn't be an insurmountable task. I can think of avgdvg, segag80v, cinemat, aztarac, and vectrex off the top of my head. Maybe one or two others.
Yeah, but I haven't looked at any of them other than avgdvg, and if they're all in the same or worse condition as avgdvg, then it's not insurmountable, but it's a real gargantuan pain in the ass that doesn't give much if any return unless Jezze and his C++-knowing cohort step up and plumb the appropriate support from vector.c upward to the OSD.
I am really sorry, but I still have some questions that hopefully will be answered. At this point I thank you very much, Just Desserts, for sharing your wisdom regarding vector games; it helped me a lot to understand the "whole" problem.
To clarify the problem further, I have some specific questions about your post here:

Originally Posted By Just Desserts
In actual fact, I think this may be a combination of four things:
- The vector system assuming everything being drawn is a line
- The vector system's internal anti-aliasing working against us
- The vector system not handling intensity in an ideal way
- The handling of vectors in d3dhlsl.c and the vector shader need to be fixed, ideally by Jezze.

First of all, the vector system assumes everything being drawn is a line. Check it out for yourself at the bottom of src/emu/video/vector.c. With what's currently in the list, it will produce a line of length 0 for the player's shot.

This certainly isn't going to be fatal, but it's not ideal, and it's even less ideal when you consider the fact that vectors, by default, are anti-aliased. How is our OSD-side anti-aliasing code going to account for the fact that it's a line of zero length? I would guess that it's probably not going to behave too terribly well.


Is there a reason why the vector system assumes that everything being drawn is a line? Couldn't we filter out the cases where a line has a length of zero and treat those cases in another way? I mean, wouldn't a line with zero length always be a point?

The next point is the anti-aliasing thing. I experimented with the anti-aliasing and looked at what happens if I turn it off via mame.ini. Strangely, it turned out to be worse rather than better frown . With anti-aliasing turned off, the bullets disappeared completely. Or is there another anti-aliasing that works internally?

No doubt, your approach would be a really accurate way of solving the problem, but seeing now how much work it would be, I am wondering if we have other alternatives that wouldn't involve so much work (15 drivers to rewrite)?
If there is no other way, then I guess it would be best to start with the Vectrex. That way, we would have at least around 40 games that would look nicer wink .

My first thought about alternatives was a point shader, but I scrapped that the moment I found out that a point is only drawn at the start of a line; I had assumed there were points at both ends of a line. From reading this thread, I assume that even this first point exists more for a technical reason, to do with how vector monitors work, so I think this point is not "visible" to the vector renderer. Is this right?

Thank you for your patience; please don't be too rude with me wink
Originally Posted By u-man
I am really sorry, but I still have some questions that hopefully will be answered. At this point I thank you very much, Just Desserts, for sharing your wisdom regarding vector games; it helped me a lot to understand the "whole" problem.
To clarify the problem further, I have some specific questions about your post here:


Are you trying to intricately understand the problem to help you fix it in some way, or are you trying to act as some sort of information "man in the middle" in the hopes of it "becoming fixed for you"?

Because if it is the latter, all you are doing is harassing people.
Originally Posted By Sharkpuncher
Because if it is the latter, all you are doing is harassing people.


Well, that's how I started out... laugh
Originally Posted By Stiletto
Originally Posted By Sharkpuncher
Because if it is the latter, all you are doing is harassing people.


Well, that's how I started out... laugh


It's fair, and I think at some point anyone who has been a fan of emulation (and its progress) has done the "oh hey, here are some pieces of information that I have collected, I will put them under my pillow and hope the emulator fairy visits" thing. In my case, I was a dumb teenager in the 90s, thinking that I somehow really knew how to find things on the internet compared to most; I might have had some token contributions, but nothing that wouldn't have happened eventually anyway, and I was probably a huge pain in the ass in the process.

But still. It seems to me that pressing people for information is not going to do anything constructive, and will just grate on people's nerves. Obviously, if he is working on something and I am misunderstanding what I am reading, then I apologize; it just sounds like the typical "I will gather all of this data together, and then my wishes will be granted and someone will make it happen" behavior. Sometimes it's sort of endearing, like how back in the day every other post from gregf was a love letter to Wild Gunman, but other times it just comes off as "obviously you guys are too dumb to fix this, but now I'm here to (????) and then it will be fixed", like people tend to be about Sega, Capcom, SNK, etc.

I wonder if vector intensity will be able to be solved in a way that will satisfy those who want it most, though. The way people often go on about the intensity, the glowing, makes me think that brightness tweaks will still leave them asking why it doesn't "do it the right way" and glow like overdriven phosphor.
Personally, I fully understand that we can never have the really striking look of the original CRTs, but I also know we can do a lot better.
What happened to VectorMAME?

http://www.zektor.com/zvg/vectormame.htm

I think that would be a great generic output device for vector systems, but I can't find any traces of it in the current MAME sources.
Originally Posted By Sharkpuncher
Are you trying to intricately understand the problem to help you fix it in some way, or are you trying to act as some sort of information "man in the middle" in the hopes of it "becoming fixed for you"?

Because if it is the latter, all you are doing is harassing people.

Well... yes, I want to understand more of the whole thing, and no, I don't expect that someone here will fix it, but I will not fix it either wink . It should not really matter to you, as long as something better comes out as a result of this discussion. It should not matter who does the fixes; as I already said, I don't expect that someone here will do it.

Originally Posted By Sharkpuncher
Originally Posted By Stiletto
Originally Posted By Sharkpuncher
Because if it is the latter, all you are doing is harassing people.


Well, that's how I started out... laugh

But still. It seems to me that pressing people for information is not going to do anything constructive, and will just grate on people's nerves. Obviously, if he is working on something and I am misunderstanding what I am reading, then I apologize; it just sounds like the typical "I will gather all of this data together, and then my wishes will be granted and someone will make it happen" behavior. Sometimes it's sort of endearing, like how back in the day every other post from gregf was a love letter to Wild Gunman, but other times it just comes off as "obviously you guys are too dumb to fix this, but now I'm here to (????) and then it will be fixed", like people tend to be about Sega, Capcom, SNK, etc.

I wonder if vector intensity will be able to be solved in a way that will satisfy those who want it most, though. The way people often go on about the intensity, the glowing, makes me think that brightness tweaks will still leave them asking why it doesn't "do it the right way" and glow like overdriven phosphor.

I don't understand what problems most of you have with asking questions about particular problems. As I don't expect any fix from your side, I still hope for answers regarding my questions... nothing more.
I really don't know for how much longer you can go on this way. Time will tell. It seems you don't care at all about a "healthy" user base, and the result is that you are losing a lot of potential contributors; a lot of team developers have already quit the scene. So tell me, how is your approach helping in any way? I will answer this... it doesn't help, and looking into the future, it is even counterproductive.
Why? Because younger generations don't have the insight into abandoned technology that most of you assume they should have. So asking questions about how something "that old" works can't be bad at all, contrary to your theory, even if I would only sit and wait for a fix (which is not true).

I have enough brain cells to know that something like this: https://www.youtube.com/watch?v=qv3IPW2qzQc can't be satisfyingly solved with the current technology of LCDs... I would even say it is impossible, but I share R. Belmont's opinion:
Originally Posted By R. Belmont
Personally, I fully understand that we can never have the really striking look of the original CRTs, but I also know we can do a lot better.

Comparing MAME vs. "other" emulators regarding vector games, there are some examples that seem far ahead when it comes to "look and feel". That's why I initially started this thread, and I think there is hope of changing this wink .
Originally Posted By Edstrom
What happened to VectorMAME?

http://www.zektor.com/zvg/vectormame.htm

I think that would be a great generic output device for vector systems, but I can't find any traces of it in the current MAME sources.


VectorMAME only worked on MS-DOS PCs with legacy parallel ports and only with the Vectrex as a practical matter, because who'd want to deprive an otherwise working vector cab of its monitor.

Someone's recently come up with a similar USB solution that works on modern PCs and operating systems, but it *only* works with the Vectrex and there aren't that many of those out there in good condition. (I have one, but I don't really want to screw with it if I don't have to).
Also, the ZVG cards are not produced any more.

I kept your link, R. Belmont, with the USB solution: https://trmm.net/Vectrex
I actually have two working Vectrexes, but I wouldn't scrap one either to interface to MAME over ZVG & co; I'd rather make a custom cartridge, connected to my PC, that turns the whole Vectrex into a vector engine.

A ZVG like device could be used in one of these laser projects coming back too:
http://hackaday.com/2014/09/20/vector-laser-projector-is-a-lesson-in-design-processes/
http://hackaday.com/2013/03/12/playing-mame-games-on-a-rgb-laser-projector/
http://hackaday.com/2011/11/10/rgb-laser-projector-is-a-jaw-dropping-build/

Once I get a better hang of the full MAME code base I might contribute something in this area, but it is rather far down my todo list, and right now I hate C++ (again) and device clocks that don't start and stop as expected, so we might talk about it in a few years from now. ;-)
Originally Posted By Edstrom
I actually have two working Vectrexes, but I wouldn't scrap one either to interface to MAME over ZVG & co; I'd rather make a custom cartridge, connected to my PC, that turns the whole Vectrex into a vector engine.


This sounds like a very interesting project. I would be one of the first customers willing to buy such a cartridge smile wink .
Originally Posted By u-man

This sounds like a very interesting project. I would be one of the first customers willing to buy such a cartridge smile wink .

I am not intending to manufacture anything, but maybe I'll post my findings for others to make their own, if they're found usable.
It's still a fantastic idea and hopefully doable for you. I wish you all the best regarding this smile .
So I got drunk and scrawled out a description of one part of the core overhaul necessary to properly support perceptually accurate vectors in MAME. Have a look, tell me what you think: https://github.com/mamedev/mame/issues/399
The description sounds very pragmatic and seems to make it easier/possible to render vectors with different brightness, and also possible to render adjacent-pixel glow?!
That's a great improvement and will give vector games a more vivid appearance.

Without any deeper knowledge of the MAME sources I probably fail to see the full effect of the proposal. However, if I just go by the Vectrex hardware, which I know fairly well, I believe it will be difficult to render the ghost vectors of the beam zeroing, as mentioned previously; these are sometimes non-linear, and I would guess that is caused by the Vectrex discharging some capacitance/inductance in the deflection circuits.

There are also some homebrews that took the Vectrex hardware, with no modifications except for the cartridge, to the extreme:
- http://hackaday.com/2015/04/24/extreme-vectrex-multicart-plays-bad-apple/
- https://www.youtube.com/watch?v=_aFXvoTnsBU

The Vectrex actually has a preferred rate of 30 refresh cycles per second, and the BIOS has a function to zero the beam and wait for the next cycle. The reason is that the phosphor glow only lasts so long, and a lower fps would make the screen flicker. The demo above uses the blanking interval to play music on the deflection circuitry with the beam off.

Chew on that for a while, maybe it is too much to expect of an emulator.
Hmm. Good point on the vector beam having behavior that isn't directly controlled by the software. I'd imagine other vector monitors have other, similar things going on?

I wonder if another layer of abstraction, like a vector monitor device (C++) that could be inherited from for different vector monitor types would be able to handle this. Every time the beam turns off, the device would track its motion and create a series of short vectors whose intensity would decay at the necessary rate. This would mean the driver wouldn't have to take care of things that are a property of the monitor - it would come "for free" from the monitor device.
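That inheritance idea can be sketched like this (the class names, the per-frame decay model, and the 0.5 decay value are all invented for illustration; nothing here is an existing MAME device): the base class owns the generic beam-decay bookkeeping, and each concrete monitor type only supplies its own phosphor persistence.

```cpp
#include <cassert>
#include <vector>

// Abstract vector monitor: generic decay logic lives here.
class vector_monitor_device
{
public:
    virtual ~vector_monitor_device() = default;

    // Each monitor type reports its own phosphor persistence per frame.
    virtual double decay_per_frame() const = 0;

    // Generic: turn the fading afterglow into a series of intensity levels
    // (in a real device, a series of short vectors along the beam's path).
    std::vector<int> decay_trail(int intensity, int frames) const
    {
        std::vector<int> trail;
        double level = intensity;
        for (int i = 0; i < frames; i++)
        {
            level *= decay_per_frame();
            trail.push_back(int(level));
        }
        return trail;
    }
};

// A concrete monitor only has to supply its persistence characteristics.
class vectrex_monitor : public vector_monitor_device
{
public:
    double decay_per_frame() const override { return 0.5; } // assumed value
};
```

The driver then gets the monitor-specific behavior "for free" by instantiating the right subclass, exactly as the paragraph above suggests.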

Maybe smile
/Andrew

PS. No level of accuracy is too much to expect of an emulator!
I like that; if we had an "analog" monitor device with some properties that could be driven by applying voltages to the deflectors and the beam, it would be easy to map that to the actual D/As of the Vectrex, for instance.

It would also be easy to multiply that to get a colour vector monitor.

I am not sure of the feasibility of emulating an analogue beam, however, but I suspect we are talking about relatively slow/few movements compared to a normal sweep-based CRT.
I'm not so sure about having a device specifically for "analog" vector monitors, as to do so would effectively be making some broad assumptions about the behavior of such monitors. Specifically, it would seem to imply that all vector monitors driven by some sort of analog timing circuit behave the same, and I doubt that that is in fact the case.

The circuitry for handling the deflection is something that I feel would be better off being handled on a per-driver basis, because such circuitry would likely be unique to a given machine and not easily made generic.

In the case of the Vectrex, for example, I feel that in an ideal case, the driver would use the netlist system to emulate the analog components attached to its DACs, and then from the output of the netlist circuitry, pass along the appropriately time-stamped vectors to the core vector rendering system.

Also, for the record, my proposal in that issue will not get us "proper" vectors, it is just one step of several in that direction. I will be writing up separate issues for all of the individual steps necessary to get proper vector rendering.
Makes sense. The beam, the deflection principles, and the different types of phosphor are common to all CRT devices, but the circuitry interfacing to the surrounding world can differ a lot.

A VGA monitor would have a sweep generator and expect sync pulses and such, but also the ID0 to ID3 and later DDC lines for monitor detection.

A TV would have an RF input, and many consoles and home computers had an RF modulator either built in or as an accessory.

Going further into the TV world, there are a number of special schemes involving serial protocols in the SCART connector and later in the HDMI connector, not to mention the VBI lines with Closed Caption and Teletext information, as well as copyright, Macrovision and WSS signalling. All these features require intelligence in the display device for OSD etc., and they eventually included an MCU to deal with all this long before SmartTV became a thing.

I think we could view the display device as something that may be emulated more exactly at some point, and there are certainly ROMs to be dumped and PCBs to be described here as well. They should be a slot device as much as anything else in MAME, IMVHO.

The Vectrex happens to be one of the oddballs here, and possibly many of the arcade games too, where the machine and the display are tightly integrated.

But I like to keep it simple too smile
First, these new answers are all very interesting, and they may be the most accurate way I can think of to reach a proper simulation of vector monitors. Sadly, it is beyond my abilities to help you or others with this frown .

Long before I started this thread, I had a nice discussion at an arcade convention with guys who own the originals in their homes and man-caves. We discussed what MAME could do to improve the appearance of vector games. Obviously the first thing was mimicking the beam and the phosphors, but it also was the first thing we knew would be very difficult to do, to the point of "maybe impossible with satisfying results". So we focused on other things that could maybe be done with less effort.

These things came out:

1. Seeing the points of drawn lines. (You could do this even with old HLSL, but the picture would be very dark, with unnaturally thin lines.) It would look like this:



With the latest changes from Jezze, it is possible to show the points more clearly and with enough brightness. My remaining problem is showing them colored for color games (i.e. a bright red point for a red line). Everything is fine with b/w games.



2. A glow that doesn't look too "artificial". You can see a good example of what I mean if you look at the Death Star explosion in Star Wars. The glow should have a circle/donut form, but looks hexagonal instead.




3. Some "wobble/shake" filter to mimic the "stable" smile screen of vector games. This gets really extreme on the cheap hardware of a Vectrex. Just watch this video at the 15 min. mark: https://www.youtube.com/watch?v=12juB-ySTWo
See how the whole screen/playfield is shaking laugh ? It gets even worse with Clean Sweep. The same happens with arcade hardware, but to a far lesser degree.

I am not sure, but such a filter could maybe also help the phosphor filter. It seems that the phosphor filter applies only to objects in motion (ok, obvious), but that has downsides in some cases. If you look at the scrolling text in Star Wars and use higher phosphor settings (0.60-0.80), you will see that the phosphor effect turns off the moment the text comes to a halt. This looks very strange, and I guess it wouldn't look so strange if we had a wobble/shake filter, as the text would still be in "slight" motion and the phosphor filter wouldn't stop working abruptly.

Star Wars logo with no motion:



Star Wars logo, with motion:



All the images here are already scaled down and not full quality (i.e. you can barely see the shadowmask etc.), but I hope they are good enough to explain the examples properly wink .
The Death Star explosion *is* visibly hexagonal on hardware, which makes perfect sense given how vector games work smile
I actually have no well-founded knowledge about vector monitors, but I had a look into the vector renderer and made some visual improvements (#432) that you might enjoy.

Originally Posted By Just Desserts
The handling of vectors in d3dhlsl.c and the vector shader need to be fixed

I removed a multiplication of the alpha by 2.0 in the vector shader, which made lines above an intensity of 127 (alpha of 0.5) completely opaque and therefore full white/r/g/b. I don't know the initial intention of this multiplication, but unless a non-normalized floating point texture is used, the sampling will always return ARGB values in the range [0.0, 1.0].
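A minimal sketch (hypothetical function name, not MAME's actual shader code) of why that multiplication saturated everything above intensity 127: with normalized sampling, alpha already lies in [0.0, 1.0], so doubling and clamping pushes the upper half of the range to full opacity.

```cpp
#include <algorithm>
#include <cassert>

// Illustration only: with a normalized texture sample, alpha is in
// [0.0, 1.0]. The removed "alpha * 2.0" means every intensity above
// 127 (alpha > 0.5) clamps to 1.0, i.e. fully opaque / full white.
float saturated_alpha(int intensity)   // intensity in [0, 255]
{
    float alpha = intensity / 255.0f;      // normalized sample
    return std::min(alpha * 2.0f, 1.0f);  // the removed multiplication
}
```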

Originally Posted By Just Desserts
The vector system assuming everything being drawn is a line

This is still the case. However, I've changed a point to no longer be a zero-length line but a line with a length of half the beam width, which makes it look like a square instead of a flat rectangle. This is admittedly a somewhat hackish workaround until a more suitable solution is implemented.
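The workaround could be sketched roughly like this (struct and function names are hypothetical, not MAME's actual code): a zero-length line gets stretched by half the beam width so the rendered quad no longer degenerates to a flat rectangle.

```cpp
#include <cassert>

struct Line { float x0, y0, x1, y1; };

// Hypothetical sketch of the workaround described above: if a "point"
// arrives as a zero-length line, give it a length of half the beam
// width so the drawn quad appears as a square rather than collapsing
// into a flat rectangle.
Line expand_point(Line l, float beam_width)
{
    float dx = l.x1 - l.x0, dy = l.y1 - l.y0;
    if (dx == 0.0f && dy == 0.0f)
        l.x1 = l.x0 + beam_width * 0.5f;  // half-beam-width length
    return l;
}
```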

Originally Posted By Just Desserts
The vector system not handling intensity in an ideal way

I came to the idea that the intensity of a beam should affect not only its brightness but also its width. So I defined the beam width by a linear slope between a configurable minimum and maximum, driven by the actual beam intensity. An optional exponential factor weights greater intensities more than lesser ones. I'm not sure if this is the right approach, but the results of the post processing speak for themselves.
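The idea could be expressed as follows (a sketch under stated assumptions: normalized intensity, hypothetical parameter names, not the actual MAME implementation): linear interpolation between a minimum and maximum width, with an exponent greater than 1 favoring brighter beams.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical sketch of intensity-dependent beam width: interpolate
// linearly between a configurable minimum and maximum width, with an
// optional exponent that weights high intensities more than low ones.
float beam_width(float intensity,                  // normalized [0.0, 1.0]
                 float width_min, float width_max,
                 float weight = 1.0f)              // > 1 favors bright beams
{
    float t = std::pow(intensity, weight);
    return width_min + t * (width_max - width_min);
}
```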


Here is a comparison of the changes.




Here is another example, from Star Wars. (starwars.ini)


I made a very small change to the vector renderer and would like to know what you think about it.

This is how vectors currently look. (The beam width is oversized to better show the difference.)



And this is how vectors look after my small change; note especially the junctions of the vectors.

The beam shouldn't be so sharp and square - it doesn't look realistic at all either way.
You're right, and I absolutely understand that the vector renderer does not produce a realistic rendering with or without these changes.

Furthermore, I think that a realistic rendering cannot be achieved by drawing lines with simple polygons, but this is how it currently works, and I am trying to improve what currently works.

So for now, whether it looks realistic is beside the point. The question is: does it appear more correct than before?
Realistic vector rendering will need everyone to have HDR monitors, so it's a ways off yet.
Or real vector monitors.... :-)
CRTs are no longer being manufactured, so good luck with that.
Originally Posted By Jezze
You're right, and I absolutely understand that the vector renderer does not produce a realistic rendering with or without these changes.

Furthermore, I think that a realistic rendering cannot be achieved by drawing lines with simple polygons, but this is how it currently works, and I am trying to improve what currently works.

I've often thought you could create a texture, or these days a shader, that would create less of a boxy solid rectangle look and more of a bright middle with a smooth alpha falloff and rounded endcaps to improve the overall look. At least that was my theory. smile
There is already an example in bgfx that shows this kind of shader in action, and this is a preview of it:

Not a big fan of faux-blooming and artificial saturation, but otherwise that looks cool.
Originally Posted By AaronGiles
I've often thought you could create a texture, or these days a shader

There is some old code in the D3D renderer, which does a texture mapping on the vector-line polygons, but I don't know if it ever worked. Maybe I can get it to work.

Originally Posted By Micko
There is already some example in bgfx

This is basically the same as what can be achieved with the HLSL post processing.
Micko, that looks cool. smile
Originally Posted By Jezze
Originally Posted By AaronGiles
I've often thought you could create a texture, or these days a shader

There is some old code in the D3D renderer, which does a texture mapping on the vector-line polygons, but I don't know if it ever worked. Maybe I can get it to work.

It never worked because I'm the one who hacked it up originally. smile

I quickly realized that a single texture wouldn't really work because it gets stretched over arbitrary lengths. But a shader could in theory do it right.
Besides the fact that most of you didn't realize the change has nothing to do with (faux) bloom or any other post-fx, but rather with the drawing of the vectors themselves (shown in a very exaggerated way, without post-fx, to make the change visible), I would say it looks way better than before.

Previously the vectors looked like single matchsticks assembled into an object; with the change, it looks more like a homogeneous/seamless object built from vectors, which in turn looks more realistic IMHO.

Someone could say the change is so minor that you wouldn't see it in "normal" usage anyway, but that would be a wrong conclusion. A small "bug", with post-fx on top, can change the final picture in a noticeable way, as often happened in the past with HLSL prior to MAME v0.160.

I told you once that a time would come when developers can't really compare abandoned technology (like vector monitors in this case) against their own code. So feedback that actually helps Jezze is most appreciated, like the issues described by JD/MG on GitHub or the Vectrex mechanics described by Edstrom. Jezze needs/wants feedback beyond "good/bad job", rather feedback that says "we tried something in the past but abandoned it because of this and that, so let's look at it again and find a solution."

You can of course sit and wait for HDR or any other upcoming technology. There is no question that HDR will improve the appearance of vector rendering, but how much remains theory until it's proven.

One of my first real jobs as a programmer was to convert HPGL vector-based plotter files to a Sixel format that could be sent to one of the first ink jet printers. When the job was delivered, one of the first pieces of feedback was that the lines were too thin due to the high resolution of the printer. However, they couldn't accept that the price for making the lines wider was higher than the $200 I received for the whole job, so I never completed that task.

If I had done it, it would have looked similar to Jezze's pictures, and at the time I was struggling with the corners: how to make them round and avoid the overlapping areas of the lines. The dots (overlapping areas) in the corners, with their higher intensity, are however needed by the Vectrex.

An improvement I would think about is to draw the vectors as filled trapezoids to avoid the overlapping areas, and then draw the dots in the corners separately as filled circles with the correct diameter (== vector line width) and an intensity that is a factor of 2 of the vector intensity. I admit I have no clue what goes in the shader and what goes in the vector renderer in the MAME graphics pipeline, but that is my feedback at this point.
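The corner-dot part of the suggestion could be sketched like this (names are hypothetical, and clamping the doubled intensity to 255 is my assumption, not part of the original suggestion):

```cpp
#include <cassert>

struct Dot { float x, y, radius; int intensity; };

// Hypothetical sketch of the suggestion above: at each vector junction,
// instead of letting two line rectangles overlap, draw a filled circle
// whose diameter equals the line width and whose intensity is twice
// the vector intensity (clamped to the valid range, an assumption here).
Dot corner_dot(float x, float y, float line_width, int vec_intensity)
{
    int i = vec_intensity * 2;
    return Dot{ x, y, line_width * 0.5f, i > 255 ? 255 : i };
}
```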
Thanks for your feedback.

The vector renderer currently only knows about lines, which have start and end coordinates, an intensity and a width; points are just zero-length lines. These lines are rendered by D3D, OpenGL and GDI in quite different ways: D3D renders rectangles, OpenGL renders simple lines, and GDI renders parallelograms, which is quite odd. (I think BGFX handles them like D3D.)
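That data model boils down to something like the following (a hypothetical sketch, not MAME's actual struct or field names):

```cpp
#include <cassert>

// Hypothetical sketch of the only primitive the vector renderer knows:
// a line with start/end coordinates, an intensity and a width.
// A point is simply a zero-length line.
struct VectorLine
{
    float x0, y0, x1, y1;  // start and end coordinates
    int   intensity;       // 0..255 beam intensity
    float width;           // beam width in screen units

    bool is_point() const { return x0 == x1 && y0 == y1; }
};
```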

I also think that it would be an improvement if we were able to render single points as circles and lines as rectangles with rounded ends.

The approach of Aaron Giles to implement a texture mapping could also be continued.