Tileset Help - RGB in 16-bit vs 8-bit
ZolleH =)
I've been working on a tileset with a single colour theme for a while now, and have discovered an absurd problem which I cannot fix. I hope someone can find a solution to this ^_^
I have been working with a range of colour [e.g. 15,15,15 to 31,31,31] in the extra spots of the standard JJ2 palette. When I compiled and ran the tileset in JJ2, it appeared fine at first in 8-bit, but when I viewed it in 16-bit gameplay, all of the colours were merged together, leaving only about 5 different shades of what I originally had in 8-bit gameplay. The question is, how can you get a wide, smooth colour gradient to appear in 16-bit gameplay the same way it appears in 8-bit gameplay, without any colours merging? Or is that even possible? ^_^ It still looks like this whether I choose to remap the tileset palette or not...
I had suspected that perhaps the increments of my colour range were at fault, since they only increase by [1,1,1] for each individual palette box ^_^ Is this step too small for Jazz to use in 16-bit gameplay? The weird thing is that it looks perfect in 8-bit gameplay, which uses *fewer* colours, but looks horrid in 16-bit gameplay, which uses *more* colours.
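For what it's worth, here's a rough sketch of what I *think* might be going on, assuming the 16-bit mode converts each palette entry to a 16-bit RGB565 colour by throwing away the low bits of each channel. I don't actually know that this is what JJ2 does, and treating my 15 to 31 values as being on the usual 0 to 255 scale is just my guess:

# Hypothetical demonstration of fine gradient steps collapsing in 16-bit colour.
# Assumption: the 16-bit display converts 8-bit-per-channel colours to RGB565
# (5 bits red, 6 bits green, 5 bits blue) by dropping the low-order bits.

def to_rgb565(r, g, b):
    # Quantize an 8-bit-per-channel colour down to RGB565 precision
    return (r >> 3, g >> 2, b >> 3)

# My grey ramp: 17 palette entries from 15,15,15 up to 31,31,31 in steps of 1
gradient = [(v, v, v) for v in range(15, 32)]

shades_8bit = len(set(gradient))                          # every entry is its own shade
shades_16bit = len(set(to_rgb565(*c) for c in gradient))  # nearby entries merge

print(shades_8bit, "palette shades ->", shades_16bit, "shades after RGB565")

Running that prints "17 palette shades -> 5 shades after RGB565", which is suspiciously close to the roughly 5 shades I'm seeing in 16-bit gameplay, so maybe steps of [1,1,1] really are just below what a 16-bit screen can tell apart.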
Thankz ^_^ (That is, IF you helped =P =) )
__________________
"...Mess with the best, die like the rest" -Gotta love that quote ^_-