UP2 (squared)
#1
A very nice board, this: Intel-based for a change, but running a version of Linux. It wasn't massively hard to get set up, as most of the graphics drivers come with it.

I have the lower-priced 2GB Celeron dual-core version, but it's still fantastically fast compared to an ARM board. The more expensive quad-core Pentium version was over my (at the time) $100 limit. I got it on a Kickstarter for even more of a discount. The base version now sells for $145 without a case or (6V) power supply, so it is basically outside my self-imposed price range. The 8GB quad-core Pentium models go up to $350; I'm not sure there's a good reason to buy those unless you need a portable Windows 10 system.

Of course that power comes at a price, in this case a massive heat sink and a case to hold it all together. But this beast is basically a mini PC, and you can install full Windows on it if you want (I might).

Everything compiles and runs just fine....except.

All my model graphics are corrupted, not in a random way; their vertex info just seems to be wrong. I assume (perhaps wrongly) that even though it's running GLES2.0 and GLSL100, it needs some more clarity in the data being sent to the shaders. I can see that the models are in the right positions, and their scales and rotations are correct, but the models themselves look more like the cubes you get out of car crushers in scrap yards...
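
For reference, by "data being sent to the shaders" I mean the vertex attribute setup, something like this minimal GLES2.0 sketch; posLoc, vertexVBO and the tightly packed float layout here are illustrative, not my actual loader:

#include <GLES2/gl2.h>

// Minimal GLES2.0 attribute sketch; posLoc (GLint) and vertexVBO (GLuint)
// are assumed to have been created elsewhere.
glBindBuffer(GL_ARRAY_BUFFER, vertexVBO);
glVertexAttribPointer(posLoc,        // location from glGetAttribLocation
                      3,             // x, y, z
                      GL_FLOAT,      // component type must match the buffer
                      GL_FALSE,      // no normalisation
                      3 * sizeof(GLfloat),
                      (const void*)0);
glEnableVertexAttribArray(posLoc);
// If the type, stride or offset here disagree with what was actually
// written into the VBO, the geometry collapses into exactly this kind of
// crushed-cube mess.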

No idea why at the moment. It may be some kind of high/low byte arrangement (endianness) issue, or it may be that some data is not being initialised on Intel systems as it would be on ARM (that's my current best guess). I'll play with it and find out.

One thing though: despite this odd corruption, this is the fastest board I've ever used; it blows all the others away... If I can fix the graphics it will be such a blast to work on.
Brian Beuken
Lecturer in Game Programming at Breda University of Applied Sciences.
Author of The Fundamentals of C/C++ Game Programming: Using Target-based Development on SBC's 



#2
As part of my (failed) efforts to get the UpCore working, I fired up this monster again, and it's still a beast. It totally outperforms every other SBC I have, but there is an issue.

At first I thought it was a shader issue, and I had been labouring under that assumption for some time when I dipped back into it, but no, I've been a bit dim. The shaders work fine; after all, the level OBJ file renders perfectly.

The problem is that the MD2 loader loads a binary image, and there is a subtle difference in how an ARM CPU and an Intel CPU deal with data: ARMs can use big-endian format for data, while Intels are always little-endian. It's not something you will notice when code is running on its own; as long as a system stores and loads its internal data consistently, an int is an int is an int, and however it is stored and loaded it will come back exactly the same.
But data saved in one format may have to be converted when it's loaded on the other. MD2 data is always stored as little endian, but it does seem that some of that data is being loaded as if it were big endian.
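
As a sketch of the kind of conversion I mean (a hypothetical helper, not the actual loader code), swapping a 16-bit field read from a little-endian file would look like this:

#include <cstdint>

// Hypothetical helper for illustration: reverse the two bytes of a
// 16-bit value when the stored byte order doesn't match what the host
// expects.
uint16_t SwapBytes16(uint16_t v)
{
    return (uint16_t)((v << 8) | (v >> 8));
}

// e.g. header.numVertices = SwapBytes16(valueReadFromFile);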

Now, my binary loader actually does that conversion, making it perfect for ARM but screwing things up for Intel... I need to add either a conversion or a different loader for Intel, and I need to be clear where the actual data is being scrambled. That's a low-priority job though; I have to fix some of the Chapter 7 files first.
Brian Beuken
Lecturer in Game Programming at Breda University of Applied Sciences.
Author of The Fundamentals of C/C++ Game Programming: Using Target-based Development on SBC's 



#3
Upgraded to Ubuntu 16.04 LTS
Fairly painless, though I can't honestly say everything is in place, since I only care about a few specific features. It's smooth, effective and easy, though I do get a very odd command line:
brian@brian-UP-APL01-Invalid-entry-length-0-DMI-table-is-broken-Stop:~$
Hmm... not to worry; maybe a Linux guru can explain what I did wrong.

I had to install an SSH server and the Mesa libs, but all good; since the Intel graphics are well documented, I'm fairly sure the Mesa libs are using the hardware.
It's still a blindingly fast unit, much faster even than the XU4 with its 8 cores; this dual-core Celeron is impressive. I wish I had bought one of the Pentiums now... but the cost was way too high.

This is also an OpenGL ES 3.2 system, so it will be a good thing to run a few test projects on as a max-power system (that and a PC) and compare performance with a lesser system like a NanoT4, for example.

Where this might fall down, though, is in multicore coding, which I'm doing some work on now for my students. As a simple dual-core Celeron with no hyper-threading, it really only has one spare core beyond the one running the main program. That creates an interesting question of what to send to a job manager, and whether to use the 2nd core as a simple job system while allowing the main core to run the bulk of the program; see the sketch below.
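
A minimal sketch of that idea, assuming C++11 threads; JobQueue and its members are made-up names for illustration, not the engine's actual job manager, and a real version would wait on a condition variable rather than spinning.

#include <atomic>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>

// Minimal two-core job sketch: the main core pushes work, a single
// worker thread on the 2nd core drains it.
class JobQueue
{
    std::queue<std::function<void()>> jobs;
    std::mutex lock;
    std::atomic<bool> running{ true };
public:
    void Push(std::function<void()> job)
    {
        std::lock_guard<std::mutex> guard(lock);
        jobs.push(std::move(job));
    }
    void WorkerLoop()  // spins; a real one would use a condition variable
    {
        while (running)
        {
            std::function<void()> job;
            {
                std::lock_guard<std::mutex> guard(lock);
                if (jobs.empty()) continue;
                job = std::move(jobs.front());
                jobs.pop();
            }
            job();
        }
    }
    void Stop() { running = false; }
};

// Usage: std::thread worker(&JobQueue::WorkerLoop, &jobQueue);
// The main core keeps rendering while the worker chews through jobs.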

Entering lscpu gives us this:
Architecture:          x86_64
CPU op-mode(s):        32-bit, 64-bit
Byte Order:            Little Endian
CPU(s):                2
On-line CPU(s) list:   0,1
Thread(s) per core:    1
Core(s) per socket:    2
Socket(s):             1
NUMA node(s):          1
Vendor ID:             GenuineIntel
CPU family:            6
Model:                 92
Model name:            Intel® Celeron® CPU N3350 @ 1.10GHz
Stepping:              9
CPU MHz:               1077.146
CPU max MHz:           2400,0000
CPU min MHz:           800,0000
BogoMIPS:              2188.80
Virtualization:        VT-x
L1d cache:             24K
L1i cache:             32K
L2 cache:              1024K
NUMA node0 CPU(s):     0,1


Actually, this makes me doubt my assertion about the messed-up MD2s: both the Raspberry Pi and the Up2 are little endian, as is the data... something is off. I'll set them up side by side and compare the data they send to the VBO, which might give me clues.

But that's all for now with this; I have to leave it for a while, as I have Chapter 7 and 8 clean-ups to do and also some Z80 and school work.
Brian Beuken
Lecturer in Game Programming at Breda University of Applied Sciences.
Author of The Fundamentals of C/C++ Game Programming: Using Target-based Development on SBC's 



#4
Video 
In case anyone has a solution to this, here's the video of the Up^2 trying to render the knight.


As you can see, the scene (which is an OBJ) and the debug line draws all work fine, so it probably isn't my setup of the shaders that's the issue.


And here's the same code running on a Raspberry Pi.




I really wish I knew why this was happening. I'm pretty sure it's a screw-up in the data formatting, but until I have time to dump a listing of the VBOs on both systems to compare, I won't be sure. For now I have to put it on the back burner, as I have lots of other stuff to do. If you are running any of the demos on an Intel-based board, let me know if you have problems; it may just be specific to the Up^2.
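
When I do get to it, the dump itself is simple enough; a minimal sketch along these lines, where the vertex array and count stand in for whatever the loader produced:

#include <cstdio>
#include <glm/glm.hpp>

// Sketch: write out the vertex data just before it goes to glBufferData,
// so the ARM and Intel listings can be diffed line by line.
void DumpVertices(const glm::vec3* verts, int count, const char* path)
{
    FILE* f = fopen(path, "w");
    if (!f) return;
    for (int i = 0; i < count; i++)
        fprintf(f, "%d: %f %f %f\n", i, verts[i].x, verts[i].y, verts[i].z);
    fclose(f);
}
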
Brian Beuken
Lecturer in Game Programming at Breda University of Applied Sciences.
Author of The Fundamentals of C/C++ Game Programming: Using Target-based Development on SBC's 



#5
Ohhhh, it was a bug... a small one but important, and probably not mine, though my activation of it was down to some sloppy code. It seems that this function in the binary reader

glm::vec3 BinaryReader::ReadCompressedVec3()
{
    glm::vec3 vec;
    // Each component is stored as a single byte in the MD2 data,
    // then widened to a float here.
    vec.x = ReadByte();
    vec.y = ReadByte();
    vec.z = ReadByte();
    return vec;
}

behaves quite differently on Intel chips: the byte that is read is sign extended (any value of 128 or over is treated as negative in two's complement binary), producing a negative float rather than a value from 0-255 as it did on ARM.
It isn't really a GLM bug at all: the signedness of a plain char is implementation-defined in C++, and GCC defaults to signed char on x86 but unsigned char on ARM, so the same char-to-float conversion sign extends on one and not the other.
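
You can see the difference with a tiny standalone test (assuming GCC, or any compiler with the same char defaults, on each board):

#include <cstdio>

int main()
{
    char c = (char)200;  // plain char: signed on x86 GCC, unsigned on ARM GCC
    float f = c;         // widening to float sign extends where char is signed
    printf("%f\n", f);   // prints -56.000000 on x86, 200.000000 on ARM
    return 0;
}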

The fix is simple: make sure you only send the byte's value to the float, and don't allow it to be sign extended.

glm::vec3 BinaryReader::ReadCompressedVec3()
{
    glm::vec3 vec;
    // Masking with 255 keeps only the low 8 bits after the char is
    // promoted to int, so the result is always 0-255 regardless of
    // whether plain char is signed on this platform.
    vec.x = ReadByte() & 255;
    vec.y = ReadByte() & 255;
    vec.z = ReadByte() & 255;
    return vec;
}

Or... a more elegant fix is for ReadByte to return an unsigned char, which is better, and what I should have done in the first place.
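
A sketch of that version; the data and bufferPos members here are made-up names for illustration, not the real reader's internals:

// Illustrative only; the actual reader's members will differ.
unsigned char BinaryReader::ReadByte()
{
    // Returning unsigned char means the implicit conversion to float in
    // ReadCompressedVec3 can never sign extend.
    return (unsigned char)data[bufferPos++];
}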

Quite a complex bug, tbh; it took a lot of tracking down and head scratching, determining where the data was faulty, what data was faulty, where it was set up and finally where it was loaded... not an easy route.

But it's all working fine now. The demos will be updated accordingly for anyone using an Intel-based board; the fix does no harm to the ARM version.
Brian Beuken
Lecturer in Game Programming at Breda University of Applied Sciences.
Author of The Fundamentals of C/C++ Game Programming: Using Target-based Development on SBC's 




