The PS4's RAM allocation looks strange, but it makes sense given that the OS imposes a hard lock on how much VRAM is available.
As for the RAM speeds, it is time for people bringing up this topic to step back and realize that faster RAM does not automatically equal faster or higher quality graphics.
There is a vast difference in how each OS handles GPU assets. With NT's GPU scheduling model and the new features in DX11.2, the OS only needs a tiny amount of high-speed RAM.
Take the tiled resources technology: using this model, 16 MB of high-speed RAM is more than enough to drive 60 fps at 1080p, and the Xbox One goes beyond that with a 32 MB high-speed cache.
The PS4 does not have a kernel-level GPU scheduler, nor can it get one without replacing its FreeBSD-based OS.
Currently only Windows NT offers kernel-level GPU scheduling, which is why Microsoft switched to WDDM 1.0 in Vista; an early variation of this technology was also used in the Xbox 360.
With the GPU scheduling technologies that ONLY exist in NT, the Xbox One will be able to offer much higher resolution textures without needing to load lower resolution versions and transition between them.
This not only increases texture quality but also improves performance and frees up RAM, as the game doesn't have to manage multiple quality levels of the same texture assets.
(These changes are also why DX11.2 will probably never be back-ported to Windows 7, as its WDDM does not have the newer features needed; the 8.1 kernel and WDDM would have to be strapped onto Windows 7 to get DX11.2 to work.)
Here is a simple video of tiled resources in action; it is just one significant advantage of the GPU scheduler technologies exclusive to NT, and it shows why this matters for gaming.
RAM speed is somewhat irrelevant on NT: the WDDM technologies were designed to handle shared GPU assets and GPU memory management, so NT only needs the high-speed eSRAM cache for the GPU to maintain fluid, higher resolution graphics.
Even PC games that use the extra VRAM virtually created by NT's WDDM are able to load full resolution textures without a loss in GPU performance, because the OS can swap them in and out more efficiently than the game could by keeping multiple resolution assets and managing them at different draw distances.
(I wish someone would do an in-depth article on how WDDM 1.3 in NT (8.1) works and how it relates to DirectX 11.2, which is also used by the Xbox One. I have yet to see any site give a properly technical explanation of these technologies, how they affect gaming, and why they matter to end users. If I get any extra time, I may post one to the forums.)