bsnes v0.038 released

Archived bsnes development news, feature requests and bug reports. Forum is now located at http://board.byuu.org/
Locked
creaothceann
Seen it all
Posts: 2302
Joined: Mon Jan 03, 2005 5:04 pm
Location: Germany
Contact:

Post by creaothceann »

FitzRoy wrote:Battle Blaze's title screen
Is it just me, or is that screen very shaky in ZSNES? :?

EDIT: Thanks for the answer.
Last edited by creaothceann on Sun Dec 21, 2008 3:11 am, edited 1 time in total.
vSNES | Delphi 10 BPLs
bsnes launcher with recent files list
byuu

Post by byuu »

Looks like Firefox 3.0 doesn't like :after { content: } much, either. Fine, I'll use <span>.

Hmm, I think I was just being lazy by not ignoring the color table stuff for values of 1.0 (100%). I'll take a look at it.
Battle Blaze's title screen -- Is it just me, or is that screen very shaky in ZSNES?
That game triggers H-IRQs on every single scanline (guess they got one of the dev manuals that was missing the HDMA section). If timing isn't near-perfect, you'll either miss IRQs or trigger extra ones, which makes the intro screen go crazy.

In-game programming is shit, too. There's a lot of graphics corruption at the top of the screen when choosing who to fight. Will only show up with a cycle-based PPU renderer.
byuu

Post by byuu »

Fixed the website again for IE6/7. Also added a few more style touches (small-caps, hanging punctuation for unordered lists, etc.).

Thanks for the algorithm fixes, Verdauga. I don't understand the contrast one, but it works so I don't really care :)

Not really in the mood to screw with the BGR555<>RGB888 conversion at the moment, nor am I too worried about loss of precision by not using floating point everywhere. I doubt many people modify that stuff from the default anyway.

Lastly, even though it isn't needed anymore, I made it bypass the update() functions when contrast/brightness/gamma are at their default states. At the very least, we don't have to worry about future mistakes like FitzRoy observed.
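
For illustration only, the kind of early-out I mean looks roughly like this (names and defaults are hypothetical, not the actual bsnes code):

Code: Select all

#include <cstdint>

//hypothetical sketch only; names and defaults here do not match bsnes's code.
//The idea: when every adjustment sits at its neutral value, the adjusted color
//equals the input color, so the whole update pass can be skipped.
struct ColorAdjust {
  int contrast   = 0;    //assumed neutral value
  int brightness = 0;    //assumed neutral value
  int gamma      = 100;  //assumed neutral value, in percent

  bool at_defaults() const {
    return contrast == 0 && brightness == 0 && gamma == 100;
  }

  uint32_t adjust(uint32_t rgb888) const {
    if(at_defaults()) return rgb888;  //bypass: nothing to do
    //... contrast / brightness / gamma math would go here ...
    return rgb888;
  }
};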
I've been looking at the default color settings and to my eyes, a contrast setting of 100 alongside the gamma curve looks the punchiest. How come your default is 80?
1. I set that when I only had 16-bit color ... so colors 0-2 all got crushed to black from the truncation (gamma ramp is an exponential curve.)
2. Cheaper CRTs still crush the low-end of the spectrum too much, so people were complaining about not being able to see details in eg Chrono Trigger Lavos' ... shell? Cocoon? Meh.

That's a good idea, though. That's one of the last strings with hot dog buttons. How about we add a third preset value (gamma ramp + gamma = 100%), and in turn I'll remove the invert colors option? Three check boxes above three buttons.
Verdauga Greeneyes
Regular
Posts: 347
Joined: Tue Mar 07, 2006 10:32 am
Location: The Netherlands

Post by Verdauga Greeneyes »

byuu wrote:Thanks for the algorithm fixes, Verdauga. I don't understand the contrast one, but it works so I don't really care :)
Thanks. I really just shuffled the variables around and used a trick blargg taught me to get integer rounding. No such luck for gamma_adjust, though (even if an integer version of pow() helped in some way, there's no way to avoid floating point there that I can see).
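
For anyone curious, the integer-rounding trick is just the usual add-half-the-divisor-before-dividing idea; a rough sketch (not the actual bsnes routine) would be:

Code: Select all

#include <algorithm>
#include <cstdint>

//rough sketch, not the actual bsnes routine: scale a channel around mid-grey by
//an integer contrast percentage. Adding half the divisor before dividing rounds
//to nearest instead of truncating toward zero.
uint8_t contrast_adjust(uint8_t channel, int contrast /*percent; 100 = unchanged*/) {
  int centered = int(channel) - 128;
  int scaled = (centered * contrast + (centered >= 0 ? 50 : -50)) / 100;
  return uint8_t(std::min(255, std::max(0, scaled + 128)));
}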
byuu wrote:(gamma ramp is an exponential curve.)
I haven't plotted it yet, but out of curiosity, where did you get the values for it? It might have been mentioned at some point in the monster thread and I just forgot ...

On a vaguely related note, I've been looking into color management/correction today. Dear lord, that stuff is complicated. Apparently the coefficients you use for sepia and greyscale are actually luma coefficients. The older recommendation used luminance coefficients, but devices applied gamma correction at the wrong time (before rather than after), so the coefficients were changed rather than the order, creating a new flavour of Y'CbCr. What I -haven't- been able to find is the 'old' set of coefficients, and whether we should be using those or the new ones. I don't know for sure when gamma is applied in this case (bsnes settings aside), although I think PCs do it in the right order, which implies we should be using the luminance coefficients.
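
For concreteness, the two coefficient sets usually cited are these (illustration only, not necessarily what bsnes uses):

Code: Select all

#include <cstdint>

//Illustration only, not necessarily what bsnes uses: the two coefficient sets
//usually cited for grey conversion, scaled to integers for exact summing.
uint8_t grey_rec601(uint8_t r, uint8_t g, uint8_t b) {
  return uint8_t((299 * r + 587 * g + 114 * b + 500) / 1000);      //0.299, 0.587, 0.114
}

uint8_t grey_rec709(uint8_t r, uint8_t g, uint8_t b) {
  return uint8_t((2126 * r + 7152 * g + 722 * b + 5000) / 10000);  //0.2126, 0.7152, 0.0722
}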

And that doesn't even touch on actual color management, which for the best perceptual quality should be based on CIECAM02 (Windows Vista supports this, apparently) with a -valid- ICC device profile for your monitor (sRGB can be used as a fallback). I downloaded a paper on the subject by the ICC, but I need to rest now x_x
FitzRoy
Veteran
Posts: 861
Joined: Wed Aug 04, 2004 5:43 pm
Location: Sloop

Post by FitzRoy »

byuu wrote: 1. I set that when I only had 16-bit color ... so colors 0-2 all got crushed to black from the truncation (gamma ramp is an exponential curve.)
2. Cheaper CRTs still crush the low-end of the spectrum too much, so people were complaining about not being able to see details in eg Chrono Trigger Lavos' ... shell? Cocoon? Meh.

That's a good idea, though. That's one of the last strings with hot dog buttons. How about we add a third preset value (gamma ramp + gamma = 100%), and in turn I'll remove the invert colors option? Three check boxes above three buttons.
See, I don't think the multiple choice thing works because users don't know what you just told me by looking at these buttons. I think it would be more beneficial to have a single agreed-upon original, against which people with special needs can make changes. I mean, if we're going to create presets with codenames for every monitor problem, what was the point of allowing anything beyond those buttons? You can definitely get away without any buttons; the effects of each option are too apparent for users to get lost.

It would also be nice if the ranges were modified to make the defaults perfect middles so they're obvious.

-95...0...95
-95...0...95
5...100...195
Panzer88
Inmate
Posts: 1485
Joined: Thu Jan 11, 2007 4:28 am
Location: Salem, Oregon
Contact:

Post by Panzer88 »

I think what byuu suggested would work just fine, but it'd be good to put a little tutorial or such in the readme. Would that be too much work?
byuu wrote:Seriously, what kind of asshole makes an old-school 2D emulator that requires a Core 2 to get full speed? >:(
byuu

Post by byuu »

See, I don't think the multiple choice thing works because users don't know what you just told me by looking at these buttons.
Yeah, there's no room for adequate descriptions. 1-3 words just can't capture it all. You'd have to click them, see what they look like, and then decide if you like it.

Really need something there, three floating checkboxes on their own look kind of tacky. Hmm ...

Might look better the old way with four options, 2-per-line, and no buttons at all.
I think it would be more beneficial to have a single agreed-upon original, against which people with special needs can make changes.
Well, I like gamma ramp + 80% gamma, myself. 100% gamma has more emphasis, but we got complaints in the past with that for crushing some details. Should we try again now that bsnes uses 24-bit output? Won't help with cheaper monitors in bright sunlight.
It would also be nice if the ranges were modified to make the defaults perfect middles so they're obvious.
Agreed.
Last edited by byuu on Mon Dec 22, 2008 2:45 am, edited 1 time in total.
sweener2001
Inmate
Posts: 1751
Joined: Mon Dec 06, 2004 7:47 am
Location: WA

Post by sweener2001 »

then people are expected to read, and nobody knows how to read
Verdauga Greeneyes
Regular
Posts: 347
Joined: Tue Mar 07, 2006 10:32 am
Location: The Netherlands

Post by Verdauga Greeneyes »

byuu wrote:You'd have to click them, see what they look like, and then decide if you like it.
Speaking of which, could you make it so that the frame redraws itself when you change an option (with a slight delay for the sliders, I imagine)? I noticed while playing with this yesterday that you can activate the main window while the settings window is open to see the change, but that's hardly intuitive. (Of course, this only applies if bsnes is set to pause when the main window is inactive - although personally I find it a bit unintuitive that it doesn't always pause when the settings window is open: it pauses when you click the menu, but not when you leave the menu to look at the configuration panel. It would make more sense if all options (aside from the driver ones) applied immediately, but that's still a bit inconvenient in my opinion.)
FitzRoy
Veteran
Posts: 861
Joined: Wed Aug 04, 2004 5:43 pm
Location: Sloop

Post by FitzRoy »

byuu wrote: Well, I like gamma ramp + 80% gamma, myself. 100% gamma has more emphasis, but we got complaints in the past with that for crushing some details. Should we try again now that bsnes uses 24-bit output? Won't help with cheaper monitors in bright sunlight.
I actually think that was me. Now that I have a better monitor, I think my previous assessment was incorrect. I'd like to do some more A/B testing with Verdauga's fix in, though, to be sure. If you could post a new WIP, I'd appreciate it.
tetsuo55
Regular
Posts: 307
Joined: Sat Mar 04, 2006 3:17 pm

Post by tetsuo55 »

Byuu, I would like to request support for the following audio renderers.

XP: ASIO and/or Kernel Streaming (these bypass the very bad kmixer, resulting in better audio quality)

Vista (and Windows 7): WASAPI exclusive. This is Vista's version of kernel streaming, but unlike ASIO and kernel streaming it works with every soundcard. Some more info here: http://forum.doom9.org/showthread.php?t=143659
byuu

Post by byuu »

New WIP.

- defaults are now centered for video settings panel sliders, modified default gamma to 100 with gamma curve enabled.
- removed all the preset buttons, it looks terrible with just one.
- fixes 99% of the useless bullshit warnings with GCC 4.3, still didn't change all the "%0.2x->%.2x" strings in the disassemblers though.
- fixed up the double->nall::string conversion, but it still has some rounding issues, so I can't use it yet. About ready to just implement that as a wrapper around sprintf (something like the sketch below).
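
If it comes to that, the sprintf fallback is about as simple as it gets; a hypothetical sketch (not nall's actual interface):

Code: Select all

#include <cstdio>
#include <string>

//hypothetical sketch of the sprintf fallback; nall::string's real interface
//differs, this just shows the idea.
std::string ftoa(double value, int precision = 6) {
  char buffer[64];
  std::snprintf(buffer, sizeof(buffer), "%.*f", precision, value);
  return std::string(buffer);
}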
Byuu, I would like to request support for the following audio renderers.
Sure, you can post them here when you're done and I'll include them in the source. If they don't cause missing driver errors on a clean install of Win2k SP4 or newer (this is why Win/OpenAL is disabled), then I'll enable them in the default binary as well.
tetsuo55
Regular
Posts: 307
Joined: Sat Mar 04, 2006 3:17 pm

Post by tetsuo55 »

Touché

I hope someone who knows C++ and these APIs can help out (because I don't).
Unlike Linux users, we Windows users are still stuck with less-than-accurate sound.
_willow_
Hazed
Posts: 51
Joined: Mon Dec 24, 2007 2:03 am
Location: Russia
Contact:

If they don't cause missing driver errors ?!

Post by _willow_ »

byuu wrote:Sure, you can post them here when you're done and I'll include them in the source. If they don't cause missing driver errors on a clean install of Win2k SP4 or newer (this is why Win/OpenAL is disabled), then I'll enable them in the default binary as well.
OpenAL drivers are part of the drivers for Creative(tm) audio cards, and they don't cause missing driver errors. You're blaming the drivers after using that ALUT c**p in bsnes, while everyone else loads OpenAL32.dll explicitly and binds the externals for smarter control. I gave you a fully working Win/OpenAL renderer ages ago, with no dependencies beyond bsnes's own: just a plain cpp and header for the Win/OpenAL and X-Fi stuff, out of the box. I've written plenty of OpenAL renderers before and never had the issues you experienced.
If you missed my old PM, that's not a problem; I'll check which bsnes version I last had my renderer working with. It can be a bit tricky to downgrade it, e.g. cutting out the hardware multichannel rendering, removing the volume controls, etc.

It would be great to bypass the software resampler and output the stream directly to a hardware voice on the audio card (in case we're talking about a gaming audio card). The data goes through the hardware voice regardless of what you think it's doing, and it has absolutely no need for a client-side resampler.
quake2xp audio engineer (http://quake2xp.quakedev.com)
tetsuo55
Regular
Posts: 307
Joined: Sat Mar 04, 2006 3:17 pm

Post by tetsuo55 »

Oops, forgot about OpenAL.

_willow_, does OpenAL allow direct-to-soundcard streaming like kernel streaming and ASIO?

In the case of the X-Fi, the best signal to send it is 32-bit/96kHz for surround and 32-bit/192kHz for stereo.
This way there is zero internal resampling in the X-Fi; however, the DAC is 24-bit, so there is a bit-depth change there.

Maybe audio mode allows 24>24, while the other modes do ANY>32>24.
The bit depth is probably padded, but all internal calculations are done in 32-bit and then reduced to 24-bit, to make sure there are fewer rounding errors in the 24-bit stream.

A software resampler built into bsnes could be better, as it can work at 64-bit or higher precision.
_willow_
Hazed
Posts: 51
Joined: Mon Dec 24, 2007 2:03 am
Location: Russia
Contact:

Post by _willow_ »

I wish folks would drop the exclusive-mode paranoia already!

PC audio is meant to be resampled and mixed, after all. That's much better than the restrictions of exclusive modes. Remember the days when a single *ding* from Windows would block all other audio output? WASAPI is the best API I've worked with so far, not because it can work in exclusive mode but because of its clever shared mode. The best thing about WASAPI is that you have to ask it which native format to resample your audio to, so the result is guaranteed not to get resampled again.
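
For the curious, querying that native shared-mode format boils down to asking the endpoint for its mix format; a bare-bones sketch (error handling omitted, not taken from bsnes):

Code: Select all

#include <objbase.h>
#include <mmdeviceapi.h>
#include <audioclient.h>

//bare-bones sketch (error handling omitted; not from bsnes): ask the default
//render endpoint for its shared-mode mix format, i.e. the format to resample to
//so the system mixer won't resample the stream again.
WAVEFORMATEX* query_mix_format() {
  IMMDeviceEnumerator* enumerator = NULL;
  IMMDevice* device = NULL;
  IAudioClient* client = NULL;
  WAVEFORMATEX* format = NULL;

  CoInitialize(NULL);
  CoCreateInstance(__uuidof(MMDeviceEnumerator), NULL, CLSCTX_ALL,
                   __uuidof(IMMDeviceEnumerator), (void**)&enumerator);
  enumerator->GetDefaultAudioEndpoint(eRender, eConsole, &device);
  device->Activate(__uuidof(IAudioClient), CLSCTX_ALL, NULL, (void**)&client);
  client->GetMixFormat(&format);  //caller frees with CoTaskMemFree()
  return format;
}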

OK, back on topic.
tetsuo55 wrote:_willow_, does OpenAL allow direct-to-soundcard streaming like kernel streaming and ASIO?
Somewhat alike: it allows direct soundcard streaming for onboard DSP processing. Kernel mode does the same. ASIO tends to shut down everything except the ASIO rendering path; if it doesn't, then it isn't really ASIO.
That's the general case, of course; on a PC you can never say for sure whether you have a direct path or wrapper upon wrapper on top of a mixer for something virtual.

Well, tetsuo55, I don't want to rain on your parade, but there is no "0 internal resampling in the X-Fi". That sucker of a chip does something in every possible mode. To color things even more sorrowfully, the ogre drivers for the X-Fi are not designed for human needs. Don't fool yourself; just go for professional cards and ASIO. Personally I have an X-Fi Elite Pro for games and a Juli@ for music, because the X-Fi is not designed for professional needs. The X-Fi even sucks at entertainment needs :(
quake2xp audio engineer (http://quake2xp.quakedev.com)
tetsuo55
Regular
Posts: 307
Joined: Sat Mar 04, 2006 3:17 pm

Post by tetsuo55 »

The best way I've found so far to avoid most of the resampling is:

pre-resample to 32-bit/96kHz, then ASIO to the soundcard, which outputs 24/96 to the DAC.

It sounds a LOT better than DirectSound.

On another topic
Has anyone tried bsnes on a core i7 yet?
_willow_
Hazed
Posts: 51
Joined: Mon Dec 24, 2007 2:03 am
Location: Russia
Contact:

Post by _willow_ »

tetsuo55 wrote:In the case of the X-Fi, the best signal to send it is 32-bit/96kHz for surround and 32-bit/192kHz for stereo.
To my knowledge that is absolutely correct.
That's the way the internal DSP mixer works. It resamples everything up to 32-bit/96kHz for surround and 32-bit/192kHz for stereo, then resamples down to what the DAC needs. If you feed the resampler the highest raw format possible, i.e. 32-bit/192kHz for stereo, it skips the upsampling but not the downsampling part, I believe. Not sure about the surround 96->192 case; actually, I think it upsamples even 96kHz for surround too. The problem is I don't believe the X-Fi supports 192kHz natively; I mean, it always scales 192kHz down to 96kHz.
There's still a small chance ASIO shuts down resampling completely on the X-Fi, and that it's missing 192kHz in ASIO mode because the chip lacks it natively.

And you guys are talking about kernel mode, lol :lol:
quake2xp audio engineer (http://quake2xp.quakedev.com)
byuu

Post by byuu »

Damnit, dates are still screwed up on IE6.

Cute FF2 rendering glitch, too. At certain sizes (every ~15px or so), it cuts off the right-hand border of boxes, and I just so happened to hit a boundary that causes the clipping. Oh well, I'm going to switch the font size to em units anyway to allow IE6 to resize the content, and that seems to be enough to "fix" it.

Is anyone familiar with the IE6 relative { absolute } bug workaround? Everyone usually applies height: 0; he\ight: auto; zoom: 1; on each relative block. That seems superfluous; I'm able to get it working without affecting anything else like so:

Code: Select all

<!--[if IE]><style>* { zoom: 1; }</style><![endif]-->
Is there a compelling reason the above is a bad idea for a site that doesn't use zoom: at all, ever?

Going to use this for the title + first-letter + date:

Code: Select all

h2 {
  border-bottom: 3px double #000;
  font-variant: small-caps;
  line-height: 1.2em;
  position: relative;
}

h2 span {
  border-left: 3px double #000;
  padding-left: 0.5em;
  position: absolute;
  bottom: 0em;
  right: 0em;
}

h3 {
  border-bottom: 1px solid #888;
  position: relative;
}

h3 span {
  border-left: 1px solid #888;
  padding-left: 0.5em;
  position: absolute;
  bottom: 0em;
  right: 0em;
}

h2:first-letter, h3:first-letter {
  color: #c00;
  font-size: 1.1em;
}

<!-- we *should* be able to put the span first, but h3:first-letter won't hit the 'T' on FF / IE (Opera works fine). Doesn't matter whether date is a span (inline) or div (block) element. Meh, whatever. -->
<h3>Title<span>2008-12-18</span></h3>
Oh, and:
http://www.w3.org/TR/css3-selectors/#on ... ype-pseudo
6.6.6. Blank

This section intentionally left blank.
How professional. Wouldn't want documents using :not(X) to become possessed by Lucifer, right? Hopefully the W3C doesn't add a new clause that browsers shouldn't render CSS3 on the Sabbath next.

EDIT: hooray! Just a strange coincidence:
In the last version of this specification, section 6.6.6. defined the
:contains() pseudo-class. This pseudo-class has been removed from the
draft (due to lack of implementations)
I haven't plotted it yet, but out of curiosity, where did you get the values for (the gamma ramp table)?
It was from Overload / Super Sleuth. A fairly simple exponential increase on the lower half of the table that becomes linear on the upper half. I don't know how he came up with it, but the effect is truly stunning.
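
Purely as an illustration of that shape (this is not Overload's actual table, just a made-up curve with the same character):

Code: Select all

#include <cmath>
#include <cstdint>

//Purely illustrative; the real table from Overload / Super Sleuth is a hand-made
//lookup, not this formula. Build a 32-entry (5-bit) ramp whose lower half curves
//up steeply from black and whose upper half is a straight line.
void build_ramp(uint8_t ramp[32]) {
  for(int i = 0; i < 32; i++) {
    double x = i / 31.0;
    double y = (x < 0.5) ? 0.5 * std::pow(2.0 * x, 2.0)  //curved lower half
                         : x;                            //linear upper half
    ramp[i] = uint8_t(y * 255.0 + 0.5);
  }
}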
Has anyone tried bsnes on a core i7 yet?
Sliver X tested for me. It's about 5-10% slower, clock-for-clock, than a Penryn (e.g. E8400) CPU. And since the $300 i7 is 2.67GHz vs the $150 E8400 @ 3GHz ... the Penryn is a better choice.

The reason it's slower is the anemic per-core L2 cache on the i7 (only 256KB, vs 3MB on the E8400).
They don't cause missing driver errors. You're blaming the drivers after using that ALUT c**p in bsnes, while everyone else loads OpenAL32.dll explicitly and binds the externals for smarter control.
I'm not binding 30+ API calls through GetProcAddress, sorry. It looks like ass. Besides, when we were testing OpenAL, not a single person got better results with it than DirectSound.

In fact, short of tetsuo, everyone seems happy with just DS.
Not saying I mind adding other drivers, just that I'd need someone else to write them.
Last edited by byuu on Mon Dec 22, 2008 8:16 pm, edited 1 time in total.
creaothceann
Seen it all
Posts: 2302
Joined: Mon Jan 03, 2005 5:04 pm
Location: Germany
Contact:

Post by creaothceann »

byuu wrote:I'm not binding 30+ API calls through GetProcAddress, sorry. It looks like ass.
Didn't you say that you're a pragmatist? :wink: I.e. using stuff that works vs. some ideal...
vSNES | Delphi 10 BPLs
bsnes launcher with recent files list
henke37
Lurker
Posts: 152
Joined: Tue Apr 10, 2007 4:30 pm
Location: Sweden
Contact:

Post by henke37 »

Dude, plugins. Just load one function from the plugin DLL with a platform-specific API, then use the object that function returns.
byuu

Post by byuu »

Didn't you say that you're a pragmatist? Wink I.e. using stuff that works vs. some ideal...
Been planning to write a more powerful C++ preprocessor for a while now for the CPU/SMP cores. I'll probably use that to write a simple Win32 API function binder to cut all the red tape out.
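
Even the plain preprocessor gets most of the way there; a hypothetical sketch (not actual bsnes code, and the OpenAL signatures are only examples):

Code: Select all

#include <windows.h>

//Hypothetical sketch of a GetProcAddress binder done with the plain preprocessor;
//not byuu's code, and the OpenAL signatures are only examples. Each BIND() line
//declares a function pointer and resolves it from the DLL in one go.
#define BIND(lib, ret, name, args) \
  ret (__cdecl *p##name) args = (ret (__cdecl *) args)GetProcAddress(lib, #name)

bool bind_openal() {
  HMODULE lib = LoadLibraryA("OpenAL32.dll");
  if(!lib) return false;
  static BIND(lib, void, alGenBuffers, (int, unsigned*));
  static BIND(lib, void, alDeleteBuffers, (int, unsigned*));
  return palGenBuffers != NULL && palDeleteBuffers != NULL;
}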
Dude, plugins. Just load one function from the plugin DLL with a platform-specific API, then use the object that function returns.
OpenAL is not object-oriented.
funkyass
"God"
Posts: 1128
Joined: Tue Jul 27, 2004 11:24 pm

Post by funkyass »

byuu wrote:
OpenAL is not object-oriented.
A potential plug-in could be, however. DLLs don't have to be OO, do they?
Does [Kevin] Smith masturbate with steel wool too?

- Yes, but don’t change the subject.
kode54
Zealot
Posts: 1140
Joined: Wed Jul 28, 2004 3:31 am
Contact:

Post by kode54 »

_willow_ wrote:
tetsuo55 wrote:In the case of the X-Fi, the best signal to send it is 32-bit/96kHz for surround and 32-bit/192kHz for stereo.
To my knowledge that is absolutely correct.
That's the way the internal DSP mixer works. It resamples everything up to 32-bit/96kHz for surround and 32-bit/192kHz for stereo, then resamples down to what the DAC needs. If you feed the resampler the highest raw format possible, i.e. 32-bit/192kHz for stereo, it skips the upsampling but not the downsampling part, I believe. Not sure about the surround 96->192 case; actually, I think it upsamples even 96kHz for surround too. The problem is I don't believe the X-Fi supports 192kHz natively; I mean, it always scales 192kHz down to 96kHz.
There's still a small chance ASIO shuts down resampling completely on the X-Fi, and that it's missing 192kHz in ASIO mode because the chip lacks it natively.

And you guys are talking about kernel mode, lol :lol:
Good god, you people suck. You're still talking about resampling, only in software. Tests have shown that the X-Fi has high quality resampling, and in hardware at that, so why bother? Looking for something to use up all those megahurtz you paid for?

X-Fi is for playing games with EAX effects, plain and simple. If you want bit perfect ASSIO bullshit, buy a professional sound card. Of course, with that software resampling, it won't be bit perfect anymore, now will it? But who can hear the difference? (Of course, if you claim you can, I'll be glad to pay you a visit and administer a 50 caliber hearing test.)
tetsuo55 wrote:The best way I've found so far to avoid most of the resampling is:

pre-resample to 32-bit/96kHz, then ASIO to the soundcard, which outputs 24/96 to the DAC.

It sounds a LOT better than DirectSound.
I bet you can't back that with double blind testing.
FitzRoy
Veteran
Posts: 861
Joined: Wed Aug 04, 2004 5:43 pm
Location: Sloop

Post by FitzRoy »

New changes look good. However, I think we need to clarify the semantic use of "adjust."

"Frequency adjust" as it is right now should just be "Input frequency", because the value reflects the frequency itself, not an adjustment. The opposite mistake is made on the color values: "Brightness" should treat 0 as no brightness, while "Brightness adjust" should treat 0 as an integer representation of the default brightness.

The question now is, what's the better system? I'm personally not a huge fan of the adjust approach.
Locked