This pre-dates the recording of a complete attendees list.
Dave Herman:
explains CPU -> GPU
state of endianness
leading into DataView
web being written as little endian
cpu little endian
file format big endian
should default for DataView be changed to little endian?
can it be changed still?
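
(For context, a minimal sketch of the DataView API under discussion: its accessors default to big endian, and a per-call boolean flag requests little endian.)

  var buf = new ArrayBuffer(4);
  var dv = new DataView(buf);
  dv.setUint32(0, 0x12345678);  // stored big endian: 12 34 56 78
  dv.getUint32(0);              // 0x12345678 (big endian by default)
  dv.getUint32(0, true);        // 0x78563412 (explicit little endian flag)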
Yehuda Katz:
Allen Wirfs-Brock:
Luke Hoban:
YK:
DH:
LH:
AWB:
General discussion about real use cases...
DH, AWB:
DH:
LH, DH:
DH:
File <-> CPU is the well-defined use case for little endianness
webGL is determined by system
Doesn't have raw data, but colleagues have evidence of code assuming little endianness - without checks
So what happens when people are implementing for big-endian systems?
Game consoles have little endian modes
How robust is that hardware support? Unknown
Alternative: Instead of allocating little, work harder to simulate little:
when a shader is compiled, implement byte swapping (see the sketch below).
No one is making the case explicitly - but the web is illustrating the behaviour
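
(A minimal sketch of the byte-swapping idea above; the function name and the fixed 32-bit word size are assumptions for illustration, not part of any proposal.)

  // Swap each 32-bit word in place so a big-endian host can present
  // its buffer contents in little-endian byte order.
  function byteSwap32(buffer) {
    var bytes = new Uint8Array(buffer);
    for (var i = 0; i + 3 < bytes.length; i += 4) {
      var b0 = bytes[i], b1 = bytes[i + 1];
      bytes[i] = bytes[i + 3];
      bytes[i + 1] = bytes[i + 2];
      bytes[i + 2] = b1;
      bytes[i + 3] = b0;
    }
    return buffer;
  }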
YK:
DH:
AWB:
DH, AWB:
DH:
AWB:
DH:
Standardize Little Endian? No opposition.
Erik Arvidsson, YK:
LH:
DH:
Bill, Brendan Eich:
DH:
People are upset at the disparity
Change DataView default to little endian?
Stance: not changing DataView
BE:
DH:
YK:
DH:
YK:
DH:
No...
If we say the web is big endian...
       CPU               GPU
  [ L, , , H ]  ->  [ L, , , H ]
If we say little endian...
  [ H, , , L ]  ->  transform  ->  [ L, , , H ]
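
(Sketch of the aliasing test web code commonly relies on, often implicitly: a Uint8Array view over a Uint32Array reveals the host byte order. On the common little-endian hardware it reports true, which is why unchecked little-endian assumptions tend to go unnoticed.)

  var probe = new Uint32Array([0x000000ff]);
  var isLittleEndian = new Uint8Array(probe.buffer)[0] === 0xff;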
YK:
Summary...
Doug Crockford:
BE:
DH:
BE:
DH:
DC:
DH:
In typed arrays, there are two types of data structures:
DataView -
BE, DH, YK:
DH:
(stepped out, lost track...)
s = struct({
  x: uint8,
  y: uint32
})
d = new DV(buf, o)
v = d.get(s, 17)

v...
  x  \
      ----> object pointing to offset starting at 17
  y  /
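
(A rough equivalent of d.get(s, 17) spelled out with plain DataView calls, reading DV above as DataView; the packed layout and little-endian y are assumptions for illustration.)

  var x = d.getUint8(17);         // s.x: one byte at offset 17
  var y = d.getUint32(18, true);  // s.y: four bytes starting at offset 18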