ViewSonic ViewPad 7 Android Tablet Review and Specifications

It seems like we've all been waiting for "the iPad killer." It's not that anybody thinks Apple's slate device needs to be knocked off its perch directly, but let's be honest: consumers benefit from having competitive options to choose from, whether strictly on price or on performance and innovation. 7- and 10-inch Android- and Windows-based tablets have been trickling out from various manufacturers over the past few months, though the pace of both development and release doesn't appear to be fast enough to keep up with Apple this holiday season. Sure, Samsung stepped up with some significant buzz for the Galaxy Tab, but in a lot of ways the total solution just didn't have the same punch as Apple's thin and light ultra-portable. Still, Android continues to become more robust as an OS, and manufacturers from all over continue to polish and refine new slate PCs.

ViewSonic is a household name that many consumers identify with thanks to its lineage in the LCD market. So, at least on the surface, it makes sense that a panel manufacturer (much like Samsung, actually) would have solid leverage in components and materials to compete in the white-hot tablet arena. We previewed their 10-inch Tegra 2-based g-tablet not long ago and actually have it in house right now for testing, though there are OS updates coming that we hear should offer a better experience. In the meantime, we also have their 7-inch ViewPad 7 tablet here, and it has recently been buffed out with Android 2.2 for what is arguably the best tablet experience on the market currently, at least on this side of the Apple fence. The ViewPad 7's 7-inch form factor is decidedly more portable than a 10-inch slate, and this device has every IO option you could ever want, including micro-USB, micro-SD card, and SIM card slots, as well as front and rear facing cameras.

ViewSonic ViewPad 7 Specifications
Android 2.2-Driven Portability
  • Android 2.2 (Froyo) Operating System
  • 600MHz Qualcomm ARM11 Processor
  • Adreno 200 Graphics Core
  • Front VGA Camera and 3MP Rear-Facing Camera
  • Android Google Mobile Services (GMS)
  • 512 MB of ROM, 512 MB RAM
  • 3G Cellular Radio with SIM Card Slot
  • Wi-Fi 802.11b/g
  • Bluetooth 2.1
  • 800×480 WVGA LCD screen
  • Capacitive Multi-Touch
  • 3,240mAh Lithium Polymer Battery
  • 4–6 hrs Battery Life (heavy, continuous use), 600 hrs Standby
  • Mini USB Port and Micro SD Slot Up To 32GB
  • 3.5mm Audio Jack
  • G-Sensor, E-Compass and Ambient Light Sensor
 
  • Access to Android Market
  • eReader TXT, HTML, EPUB, PDF, Office
  • ViewPad 7, Charger, Leatherette Cover, USB Cable, Earphones, Quick Start Guide
  • Pre-loaded Apps

USB, Bluetooth, micro-SD card slot, SIM card slot -- that's a laundry list of specifications above, many of which iPad owners wish they could lay claim to as well, save perhaps for the 600MHz ARM11 CPU and only 512MB of internal storage.  However, drawing parallels to the iPad really isn't the right approach.  The two are very different devices really, though competing in the same product segment. In addition, micro-SD cards are cheap, so dropping in another 16 - 32GB of storage could be a small $25 - $50 upgrade.  And as you'll find out, that 600MHz ARM11 isn't quite as underpowered as you might think.  Let's drop down another level for a closer look.
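As a rough sanity check on that 4-6 hour runtime claim against the 3,240mAh battery in the spec list above, you can convert the rated capacity to watt-hours and divide by an average power draw. The 3.7V nominal cell voltage and the 2.5W draw figure below are our assumptions for illustration, not ViewSonic's numbers; a minimal sketch:

```c
#include <stdio.h>

int main(void) {
    double capacity_mah = 3240.0; /* rated pack capacity from the spec sheet */
    double nominal_v    = 3.7;    /* assumption: typical Li-polymer nominal voltage */
    double avg_draw_w   = 2.5;    /* assumption: average draw under heavy use */

    double capacity_wh = capacity_mah / 1000.0 * nominal_v; /* ~12 Wh */
    printf("Pack energy: %.1f Wh\n", capacity_wh);
    printf("Estimated runtime: %.1f hours\n", capacity_wh / avg_draw_w); /* ~4.8 h */
    return 0;
}
```

At 2-3W of average draw, a ~12Wh pack works out to roughly four to six hours, which lines up with the quoted figure.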

Western Digital WD TV Live Hub Review

A voice whispered to someone at Western Digital: "If you stream it, they will come." After hearing this, he marched into the board room and pitched his idea for a streaming media player, and thus the WD TV series was born. Now whether or not it actually played out like this is irrelevant (it didn't); what matters is that Western Digital did build a line of streaming set-top boxes, and the customers have certainly shown up.

Western Digital's WD TV Live Hub, which is what we're looking at today, is the company's fourth generation media streamer, and it's the most fully functional to date. Unlike the previous generation WD TV Live Plus, as well as those that came before it, this latest iteration brings several welcome additions, including a built-in 1TB hard drive, a built-in media server to stream content to multiple rooms, and more apps than before, including the ability to download movies and TV shows from Blockbuster On Demand.

The idea here is simple. Just plop the WD TV Live Hub into your home theater, connect it to your home network, and you can shuttle movies, photos, and music back and forth between the set-top box and any of your network-connected PCs. And while you're at it, you can tap into your Pandora account, check your Facebook news feed, watch Netflix videos, and do a whole bunch more, all without the complication or cost of integrating a true home theater PC into your living room.

WD TV Live Hub
Specifications & Features
CPU: Sigma Designs 500MHz
Platform: Mochi
Internal Storage: 1TB
Video Formats: AVI (Xvid, AVC, MPEG 1/2/4), MPG/MPEG, VOB, MKV (H.264, x264, AVC, MPEG 1/2/4, VC-1), TS/TP/M2T (MPEG 1/2/4, AVC, VC-1), MP4/MOV (MPEG-4, H.264), M2TS, WMV9
Photo Formats: JPEG, GIF, TIF/TIFF, BMP, PNG
Audio Formats: MP3, WAV/PCM/LPCM, WMA, AAC, FLAC, MKA, AIF/AIFF, OGG, Dolby Digital, DTS
Playlists: PLS, M3U, WPL
Subtitles: SRT, ASS, SSA, SUB, SMI
Connectivity: Gigabit Ethernet, USB 2.0, HDMI, Composite A/V, Component Video, Optical Audio
Dimensions: 1.25 x 7.80 x 6.10 inches (H x D x W)
Weight: 1.22 Pounds
OS Support: Windows / Mac

Perhaps most impressive right off the bat is the number of video, photo, and audio formats the WD TV Live Hub supports. Western Digital warns that you won't be able to play "protected premium content such as movies or music from the iTunes Store, Cinema Now, Movielink, Amazon Unbox, and Vongo," but pretty much everything else is fair game.

We also like that Dolby Digital and DTS are thrown into the mix, and the various connectivity options are a definite plus. It's clear that Western Digital put a lot of effort into making sure its latest media set-top box would integrate seamlessly into just about any home theater/network setup (sans Linux).

ASUS Eee Pad Slider Review

Regardless of OS, tablets all provide a far more intimate experience when browsing the web and reading emails. I genuinely prefer doing both of those things on a tablet rather than on a notebook or desktop. Then there are the apps. Photos, maps, ebooks, videos and even IP cameras are comfortably accessible from tablets. Obviously you can do the same on a notebook or desktop, but the tablet form factor combined with a responsive touch UI simply means you can do these things in a more relaxed position.

What has always frustrated me with tablets however is what happens when you have to give any of these apps a significant amount of input. While the virtual keyboards on tablets are pretty mature, the form factor doesn't allow for quick typing like on a smartphone. A smartphone is easily cradled in both of your hands while your thumbs peck away at the keyboard. A tablet however needs to be propped up against something while you treat it like a keyboard. Put it on your lap and you have to hunch over the thing because the screen and input surface are on the same plane (unlike a notebook where the two are perpendicular to one another). Try to type in a reclined position on a couch and you end up lying awkwardly with your thighs and thumbs supporting the tablet. Ever see the iPad billboards and note the really awkward leg placement in them?
The excuse for the tablet has always been that it's a consumption device, not one for productivity. But what if I want to browse the web and respond to long emails? Must I keep switching between a tablet and a notebook, between consumption and productivity device? That has always seemed silly to me. In striving for comfort and efficiency, it seems that having to constantly switch between two large devices would be both uncomfortable and inefficient. After all, who browses the web for a while and then switches to only writing emails, without intermixing the two? Perhaps these discrete usage models are somewhat encouraged by the lack of true multitasking (rather than task switching) in modern tablet OSes, but eventually things must change.

Windows 8 alone will bring change, as it finally addresses the issue of having two things on your screen at once. On today's tablets, for the most part, once you're in an application that's all you get to interact with. One of the biggest issues I have is that it's virtually impossible to carry on an IM conversation on a tablet while doing anything else. Without constantly (and frustratingly) switching between apps, it's impossible to have a conversation and browse the web, for example.

What about on the hardware side of things? Bluetooth keyboards and keyboard docks have existed since the very first of this new generation of tablets hit the market. These accessories have all been very functional, but they do tend to hinder the portability of tablets. With its Eee Pad Transformer, ASUS addressed the issue by offering a keyboard dock that would turn the tablet into an Android netbook while extending its battery life. The end result was an extremely flexible device, but it still required that you either carry around a significantly bulkier tablet or make a conscious decision to take one or both pieces of the setup (tablet + dock).

Continuing down this road of experimenting with transformable tablets, ASUS' next attempt to bring the best of both tablet and netbook worlds comes in the form of the Eee Pad Slider.


Eee Pad Transformer + Dock (left) vs. Eee Pad Slider (right)
The Slider takes the same basic Eee Pad tablet from the Transformer and integrates a slim, sliding keyboard. You only get a single battery (25 Wh) but you get a much thinner and lighter form factor than the Transformer with its dock.
2011 Tablet Comparison
  | ASUS Eee Pad Transformer | ASUS Eee Pad Transformer + Dock | ASUS Eee Pad Slider | Samsung Galaxy Tab 10.1
SoC | NVIDIA Tegra 2 (Dual ARM Cortex A9 @ 1GHz) | NVIDIA Tegra 2 (Dual ARM Cortex A9 @ 1GHz) | NVIDIA Tegra 2 (Dual ARM Cortex A9 @ 1GHz) | NVIDIA Tegra 2 (Dual ARM Cortex A9 @ 1GHz)
GPU | NVIDIA GeForce | NVIDIA GeForce | NVIDIA GeForce | NVIDIA GeForce
RAM | 1GB | 1GB | 1GB | 1GB
Display | 1280 x 800 IPS | 1280 x 800 IPS | 1280 x 800 IPS | 1280 x 800 PLS
NAND | 16GB | 16GB | 16GB | 16GB
Dimensions | 271 x 175 x 12.95mm | 271 x 183 x 16 - 28mm | 273 x 180.3 x 17.3 - 18.3mm | 256.6 x 172.9 x 8.6mm
Weight | 695g | 1325g | 960g | 565g
Price | $399 | $550 | $479 | $499

The price isn't as attractive as the base Eee Pad Transformer's. At $479 for the 16GB WiFi version you're now well into Galaxy Tab/iPad 2 territory, but you do get a built-in keyboard. Samsung's keyboard for the Galaxy Tab is priced at $50, while Apple's Bluetooth keyboard for the iPad 2 (and Macs) will set you back $70. Viewed this way the Slider is still a steal, but if the recent TouchPad sale and Kindle Fire release taught us anything, it's that there's a huge market for non-Apple tablets, just not at $500. ASUS was on the right track by pricing the Eee Pad Transformer at $399, but the Slider at $479 takes a step in the wrong direction.
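To spell out that package math (the $499 iPad 2 base price is our assumption; the other figures are quoted above), a quick sketch:

```c
#include <stdio.h>

int main(void) {
    double slider = 479.0;        /* Eee Pad Slider, keyboard built in */
    double tab    = 499.0 + 50.0; /* Galaxy Tab 10.1 + Samsung keyboard */
    double ipad2  = 499.0 + 70.0; /* assumed iPad 2 base price + Apple keyboard */

    printf("Eee Pad Slider:        $%.0f\n", slider);
    printf("Galaxy Tab + keyboard: $%.0f\n", tab);   /* $549 */
    printf("iPad 2 + keyboard:     $%.0f\n", ipad2); /* $569 */
    return 0;
}
```

As a bundle the Slider undercuts both pairings by $70-$90; the problem is simply that $479 is a lot to ask for any non-Apple tablet right now.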

The Display & Hardware

The Slider starts out very similarly to the Transformer. You get a 10.1-inch IPS panel with a Honeycomb-standard 1280 x 800 resolution (1920 x 1200 is what the next generation of Android tablets will sport). The display is near-identical to what ASUS used in the Transformer. Max brightness ends up at an iPad 2-like 378 nits, while overall contrast ratio appears to have improved a bit.

(Charts: Display Brightness and Display Contrast)

ASUS does need to start calibrating these panels at the factory, though. The Slider's white point is set to 7700K, noticeably cooler than the standard 6500K target.

Viewing angles are all great; the only issue with the Slider's display is the large gap between the outermost glass and the LCD panel itself. The additional glare is a problem in any case where there's direct light shining on the screen. Most of these tablets aren't good outdoors in direct sunlight to begin with, but this issue does make the Slider a bit more annoying to use compared to the iPad 2 or Galaxy Tab 10.1, for example.
All of the outward facing materials are either glass or soft touch plastic, a subtle but noticeable improvement over the Transformer. The smell of the soft touch plastic is distinct but not all that pleasant. Here's hoping it fades quickly. The durability of the soft touch coating is also a concern.


ASUS was smart enough to include five rubber feet on the back of the Slider. With the keyboard deployed the Slider's back serves as its stand, so the feet are necessary to keep your Eee Pad pristine. The overall design is clearly ASUS' own creation, but I wouldn't call it particularly memorable. What matters the most is that it's functional, and there can be no question of that.

The perimeter of the Slider is ports-a-plenty. On the right edge of the tablet is a full sized USB 2.0 port and headphone jack. On the left there's a microSD slot and along the top there's ASUS' dock connector and mini HDMI out (type C connector). Charging is handled via the same USB adapter that shipped with the Eee Pad Transformer.


Power, reset and volume up/down buttons are also located on the left side of the tablet. Yes, that's right, there's an actual reset button on the Eee Pad Slider. The button is recessed so as to avoid any accidental activation. A single click of it will reset the Slider, no questions asked.


I'm actually very happy there's a reset button on the tablet. As these devices become even more PC-like, expect them to encounter the same sorts of stability issues as any hardware running complex software.
The Slider has two cameras: a 5MP rear facing module and 1MP front facing unit. There's a subtle, smartphone-sized bulge around the rear camera module. The bulge is noticeable but it doesn't clear the height of the rubber feet so you don't have to worry about resting your tablet on the rear camera.
The Slider is significantly heavier than the stock Eee Pad (without dock) for obvious reasons. And compared to the Samsung Galaxy Tab, well, there's just no comparison there. That being said, the Slider is still much nicer to carry around than the Eee Pad + dock (it's far less bulky), and it's more convenient than most notebooks in this price range. You really do get the full tablet experience with much of the notebook experience thanks to the integrated keyboard.

Qualcomm's Snapdragon S4: MSM8960 & Krait Architecture Explored

Let's recap the current smartphone/tablet SoC landscape. Everything shipping today is built on a 4x-nm process, built either at Global Foundries, Samsung, TSMC or UMC. Next year we'll see a move to 28nm (bringing better performance and power characteristics) but between now and the end of 2012 there will be a myriad of designs available on the market.

The table below encapsulates much of what you can expect over the next 12 months:


2011/2012 SoC Comparison
SoC | Process Node | CPU | GPU | Memory Bus | Release
Apple A5 | 45nm | 2 x ARM Cortex A9 w/ MPE @ 1GHz | PowerVR SGX 543MP2 | 2 x 32-bit LPDDR2 | Now
NVIDIA Tegra 2 | 40nm | 2 x ARM Cortex A9 @ 1GHz | GeForce | 1 x 32-bit LPDDR2 | Now
NVIDIA Tegra 3/Kal-El | 40nm | 4 x ARM Cortex A9 w/ MPE @ ~1.3GHz | GeForce++ | 1 x 32-bit LPDDR2 | Q4 2011
Samsung Exynos 4210 | 45nm | 2 x ARM Cortex A9 w/ MPE @ 1.2GHz | ARM Mali-400 MP4 | 2 x 32-bit LPDDR2 | Now
Samsung Exynos 4212 | 32nm | 2 x ARM Cortex A9 w/ MPE @ 1.5GHz | ARM Mali-400 MP4 | 2 x 32-bit LPDDR2 | 2012
TI OMAP 4430 | 45nm | 2 x ARM Cortex A9 w/ MPE @ 1.2GHz | PowerVR SGX 540 | 2 x 32-bit LPDDR2 | Now
TI OMAP 4460 | 45nm | 2 x ARM Cortex A9 w/ MPE @ 1.5GHz | PowerVR SGX 540 | 2 x 32-bit LPDDR2 | Q4 11 - 1H 12
TI OMAP 4470 | 45nm | 2 x ARM Cortex A9 w/ MPE @ 1.8GHz | PowerVR SGX 544 | 2 x 32-bit LPDDR2 | 1H 2012
TI OMAP 5 | 28nm | 2 x ARM Cortex A15 @ 2GHz | PowerVR SGX 544MPx | 2 x 32-bit LPDDR2 | 2H 2012
Qualcomm MSM8x60 | 45nm | 2 x Scorpion @ 1.5GHz | Adreno 220 | 2 x 32-bit LPDDR2* | Now
Qualcomm MSM8960 | 28nm | 2 x Krait @ 1.5GHz | Adreno 225 | 2 x 32-bit LPDDR2 | 1H 2012

The key is this: other than TI with OMAP 5 in the second half of 2012 and Qualcomm with Krait, no one has announced plans to release a new microarchitecture in the near term. Furthermore, if we only look at the first half of next year, Qualcomm is the only company focused on significantly improving per-core performance through a new architecture. Everyone else is scaling up either core count (NVIDIA) or clock speed. As we've seen in the PC industry, however, generational performance gaps are hard to overcome - even with more cores or frequency.
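One way to see why more cores alone can't close that gap is a quick Amdahl's law sketch. This is our back-of-the-envelope illustration, not vendor data, and the 50% parallel fraction is an assumption:

```c
#include <stdio.h>

/* Amdahl's law: speedup = 1 / ((1 - p) + p / n),
 * where p is the parallel fraction and n the core count. */
static double amdahl(double p, int n) {
    return 1.0 / ((1.0 - p) + p / (double)n);
}

int main(void) {
    double p = 0.5; /* assumption: half the workload scales with cores */
    printf("2 cores: %.2fx\n", amdahl(p, 2)); /* 1.33x */
    printf("4 cores: %.2fx\n", amdahl(p, 4)); /* 1.60x */
    /* A 30% per-core architectural gain speeds up the serial portion
     * too, i.e. a flat 1.30x across the whole workload. */
    return 0;
}
```

Doubling cores on a half-parallel workload buys only 33%, while a per-core gain applies everywhere - which is exactly why a new microarchitecture matters.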

Qualcomm has an ARM architecture license, enabling it to build its own custom microarchitectures that implement the ARM instruction set. This is similar to how AMD has an x86 license but designs its own chips rather than just producing clones of Intel processors. Qualcomm remains the only active player in the smartphone/tablet space that uses its architecture license to put out custom designs. The benefit of a custom design is typically better power and performance characteristics compared to the more easily synthesizable designs you get directly from ARM. The downside is that development time and costs go up tremendously.
Scorpion was Qualcomm's first Snapdragon CPU architecture. At a high level it looked very much like an optimized ARM Cortex A8 design, although the two had nothing in common outside of the instruction set. Scorpion was a dual-issue, mostly in-order architecture that eventually scaled to dual-core and 1.5GHz variants.
Scorpion was pretty much the CPU architecture of choice in the 2009 - 2010 timeframe. Throughout 2011 however, Qualcomm has been very quiet as dual Cortex A9 designs from NVIDIA, Samsung and TI have surpassed it in terms of performance.

Going into 2012, Qualcomm is set for a return to glory as it will be the first to deliver a brand new microprocessor architecture and the first to ship 28nm SoCs in volume. Qualcomm's next-generation SoCs will also be the first to integrate an LTE modem on-die, which should enable LTE on nearly all high-end devices at much better power levels than current multi-chip 4x-nm solutions. Today we're able to talk a bit about the architecture details and performance expectations of Qualcomm's next-generation SoC due out in the first half of 2012.

Krait Architecture


The Krait processor is the heart of Qualcomm's second-generation custom Snapdragon architecture, and it's the core of all Snapdragon S4 SoCs. Krait takes the aging base of Scorpion and gives it a much needed dose of adrenaline.
Krait's front end is significantly wider. The architecture can fetch and decode three instructions per clock. The decoders are equally capable of decoding any ARMv7-A instructions. The wider front end is a significant improvement over the 2-wide Scorpion core. It alone will be responsible for a tangible increase in IPC.

Architecture Comparison
  | ARM11 | ARM Cortex A8 | ARM Cortex A9 | Qualcomm Scorpion | Qualcomm Krait
Decode | single-issue | 2-wide | 2-wide | 2-wide | 3-wide
Pipeline Depth | 8 stages | 13 stages | 8 stages | 10 stages | 11 stages
Out of Order Execution | N | N | Y | Partial | Y
FPU | VFP11 (pipelined) | VFPv3 (not pipelined) | Optional VFPv3-D16 (pipelined) | VFPv3 (pipelined) | VFPv3 (pipelined)
NEON | N/A | Y (64-bit wide) | Optional MPE (64-bit wide) | Y (128-bit wide) | Y (128-bit wide)
Process Technology | 90nm | 65nm/45nm | 40nm | 40nm | 28nm
Typical Clock Speeds | 412MHz | 600MHz/1GHz | 1.2GHz | 1GHz | 1.5GHz

The execution back-end receives a similar expansion. Whereas the original Scorpion core only had three ports to its execution units, Krait increases that to seven. Krait can issue up to four instructions in parallel. The additional execution ports simply help prevent any artificial constraints on ILP. This is another area where Krait will be able to see significant IPC gains.

Krait's fetch and decode stages are obviously in-order, but the back-end is entirely out-of-order. Qualcomm claims that any instruction can be executed out of order, assuming that doing so doesn't create any new hazards. Instructions are retired in order.


Qualcomm lengthened Krait's integer pipeline slightly from 10 stages in Scorpion to 11 stages in Krait. Load/store operations tack on another two cycles and instructions that go through the Neon/VFP path further lengthen the pipe. ARM's Cortex A15 design by comparison features a 15-stage integer pipeline.

Qualcomm's design does contain more custom logic than ARM's stock A15, which has typically given it a clock speed advantage. The A15's deeper pipeline should give it a clock speed advantage as well. Whether the two effectively cancel each other out remains to be seen.

Qualcomm Architecture Comparison
  | Scorpion | Krait
Pipeline Depth | 10 stages | 11 stages
Decode | 2-wide | 3-wide
Issue Width | 3-wide? | 4-wide
Execution Ports | 3 | 7
L2 Cache (dual-core) | 512KB | 1MB
Core Configurations | 1, 2 | 1, 2, 4

Krait has been upgraded to support the new virtualization instructions added in the Cortex A15. Also like the A15, Krait enables LPAE for 40-bit physical memory addressing (2^40 bytes, or 1TB of addressable memory).

At a high level, Qualcomm has built a 3-wide, out-of-order engine that feels very much like a modern version of Intel's old P6. Whereas designs from the A8 generation looked like the Pentiums of old, Krait takes us into the era of the Pentium II.

Note that courtesy of the wider front-end and OoO execution engine, Krait should be a higher performance architecture than Intel's Atom. That's right, you'll be able to get better performance than some of the very first Centrino notebooks in your smartphones come 2012.

Performance Expectations

Performance of ARM cores has always been characterized by DMIPS (Dhrystone Millions of Instructions per Second). An extremely old integer benchmark, Dhrystone was popular in the PC market when I was growing up but was abandoned long ago in favor of more representative benchmarks. Still, you can get a general idea of performance improvements across similar architectures, assuming there are no funny compiler tricks at play. The comparison of single-core DMIPS/MHz is below:

ARM DMIPS/MHz
  | ARM11 | ARM Cortex A8 | ARM Cortex A9 | Qualcomm Scorpion | Qualcomm Krait
DMIPS/MHz | 1.25 | 2.0 | 2.5 | 2.1 | 3.3

At 3.3 DMIPS/MHz, Krait should be around 30% faster than a Cortex A9 running at the same frequency. At launch Krait will also be clocked about 25% higher than most A9s on the market today (1.5GHz vs. 1.2GHz), a gap that will only grow as Qualcomm introduces subsequent versions of the core. It's not unreasonable to expect a 30 - 50% gain in performance over existing smartphone designs. ARM hasn't published DMIPS/MHz numbers for the Cortex A15, although rumors place its performance around 3.5 DMIPS/MHz.
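Those figures make the claims easy to check. A quick calculation using the table's DMIPS/MHz numbers and the clock speeds discussed above:

```c
#include <stdio.h>

int main(void) {
    double a9_dmips_mhz    = 2.5;  /* Cortex A9, from the table */
    double krait_dmips_mhz = 3.3;  /* Krait, from the table */
    double a9_mhz    = 1200.0;     /* typical A9 clock today */
    double krait_mhz = 1500.0;     /* Krait at launch */

    /* Per-clock advantage: 3.3 / 2.5 = 1.32, i.e. ~30% faster. */
    printf("Per-clock: +%.0f%%\n",
           (krait_dmips_mhz / a9_dmips_mhz - 1.0) * 100.0);

    /* Peak Dhrystone throughput: 4950 vs. 3000 DMIPS. */
    double a9    = a9_dmips_mhz * a9_mhz;
    double krait = krait_dmips_mhz * krait_mhz;
    printf("Krait %.0f vs. A9 %.0f DMIPS (+%.0f%%)\n",
           krait, a9, (krait / a9 - 1.0) * 100.0);
    return 0;
}
```

The peak Dhrystone gap works out to around +65%, but Dhrystone is synthetic; the 30 - 50% real-world estimate above is the more conservative read.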

Updated VeNum Unit

ARM's NEON instruction set is handled by a dedicated unit in all of its designs, and Krait is no different. Qualcomm calls its NEON engine VeNum and has increased its issue capabilities by 50%: whereas Scorpion could only issue two NEON instructions in parallel, Krait can do three. Qualcomm's NEON data paths are still 128 bits wide.
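To make that 128-bit datapath concrete, here's a minimal C sketch using ARM's NEON intrinsics (compile with a NEON-enabled ARM toolchain; the values are arbitrary). Each intrinsic below maps to a single NEON instruction operating on four 32-bit lanes at once:

```c
#include <arm_neon.h>
#include <stdio.h>

int main(void) {
    float a_data[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    float out[4];

    /* Load four packed floats into one 128-bit NEON register. */
    float32x4_t a = vld1q_f32(a_data);
    /* Broadcast a scalar across all four lanes. */
    float32x4_t b = vdupq_n_f32(0.5f);
    /* One instruction multiplies all four lanes in parallel;
     * a 128-bit unit like VeNum handles the full vector per issue. */
    float32x4_t prod = vmulq_f32(a, b);

    vst1q_f32(out, prod);
    printf("%.1f %.1f %.1f %.1f\n", out[0], out[1], out[2], out[3]);
    return 0;
}
```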

HTC Sensation 4G Review - A Sensational Smartphone

I like what HTC has been up to lately. Rather than fighting a race to the bottom with endless soulless variants of the same piece of hardware in a crowded (and fiercely competitive) Android handset market, it’s trying to grow beyond just being a handset manufacturer.

I hate starting reviews with history lessons, but in this case we really do need to step back to see where HTC is coming from. In the beginning, HTC was a nameless OEM for other, more famous brands. Its clients were smartphone and Pocket PC names like Palm with its Treo, Compaq with its iPaq, Dell with a number of the Axim PDAs, and UTStarcom. As Windows Mobile aged and showed little sign of improving, HTC took its first step outside the bounds of being just a hardware assembler by taking on an ambitious project to revitalize Windows Mobile with a software skin. The fruits of this effort were the TouchFlo and later TouchFlo 3D UIs, which would eventually become HTC Sense. Somewhere between the release of the HTC Mogul and HTC Touch Pro, HTC realized that its future wasn't purely in manufacturing devices for other handset vendors, but in leveraging its own brand. The combination of continually improving industrial design, software, and its own direction has turned HTC into the device manufacturer it is today.
Things have come a long, long way since the HTC Dream, and today we’re looking at HTC’s latest and greatest with the HTC Sensation.


I get a bit excited every time I look at the HTC Sensation. It's a device with perhaps the strongest and boldest design language of any HTC phone to date. You can pretty much chart HTC's design language by looking at each generation of its international handsets.


The HTC Desire was essentially an international version of the Nexus One, with hardware buttons but the same 65nm single core Snapdragon QSD8250 SoC. The second generation was the HTC Desire HD, which brought a larger 4.3” screen and 45nm Snapdragon MSM8255 SoC. The third step is the HTC Sensation, which ups resolution from WVGA 800x480 to qHD 960x540 and brings a 45nm dual core Snapdragon MSM8260 SoC.

Physical Comparison
  | Apple iPhone 4 | HTC Thunderbolt | LG Optimus 2X/G2x | HTC Sensation
Height | 115.2 mm (4.5") | 122 mm (4.8") | 123.9 mm (4.87") | 126.3 mm (4.97")
Width | 58.6 mm (2.31") | 67 mm (2.63") | 63.2 mm (2.48") | 65.5 mm (2.58")
Depth | 9.3 mm (0.37") | 13.2 mm (0.52") | 10.9 mm (0.43") | 11.6 mm (0.46")
Weight | 137 g (4.8 oz) | 183.3 g (6.46 oz) | 139.0 g (4.90 oz) | 148 g (5.22 oz)
CPU | Apple A4 @ ~800MHz | 1 GHz MSM8655 45nm Snapdragon | 1 GHz Dual Core Cortex-A9 Tegra 2 AP20H | 1.2 GHz Dual Core Snapdragon MSM8260
GPU | PowerVR SGX 535 | Adreno 205 | ULP GeForce | Adreno 220
RAM | 512MB LPDDR1 (?) | 768 MB LPDDR2 | 512 MB LPDDR2 | 768 MB LPDDR2
NAND | 16GB or 32GB integrated | 4 GB NAND with 32 GB microSD Class 4 preinstalled | 8 GB NAND with up to 32 GB microSD | 4 GB NAND with 8 GB microSD Class 4 preinstalled
Camera | 5MP with LED Flash + Front Facing Camera | 8 MP with autofocus and dual LED flash, 720p30 video recording, 1.3 MP front facing | 8 MP with AF/LED Flash, 1080p24 video recording, 1.3 MP front facing | 8 MP AF/Dual LED flash, VGA front facing
Screen | 3.5" 640 x 960 LED backlit LCD | 4.3" 800 x 480 LCD-TFT | 4.3" 800 x 480 LCD-TFT | 4.3" 960 x 540 S-LCD
Battery | Integrated 5.254 Whr | Removable 5.18 Whr | Removable 5.6 Whr | Removable 5.62 Whr

Physically it’s obvious that each successive device builds on the former. They’re all backed with HTC’s trademark purple-grey metal and have similar in-hand feel as a result. When I look at the Sensation, I see the Desire crossed with the Desire HD. When I actually hold the Sensation, I feel like I’m holding a grown-up Nexus One.


The two share that trademark combination of slightly rubbery plastic and metal, and as a result the device feels grippy, solid, and confident. What the Sensation also continues from the other devices is the lack of a hard lip of any kind at the edge; instead, every corner rolls off, giving the phone a smooth feeling. The sensation of holding something rigid and expensive is communicated by that combination of materials, rather than the cheap plasticky feel conveyed by a number of other handsets.


The Sensation comes in the same style of packaging that we've seen other T-Mobile phones arrive in. It's a two-part box with a thin middle strip. The top lifts off revealing the phone, and underneath that is the usual paperwork, HTC AC adapter and microUSB cable, and earbuds.

ASRock A75 Extreme6 Review and Desktop Llano Overclocking

Our initial tests with the ASRock A75 Extreme6 were based on a pre-release model, as shown in our preview.  At that point the board design was not finalized and the BIOS was still quite raw, but the performance picture was essentially complete.  Now, however, the full release version of the Extreme6 is in my grasp.  Alongside this standard motherboard review, in which we test whether it's worth the $150 asking price, we're also going to take a good look at the overclocking features of the desktop Llano platform.

I've essentially run our motherboard test suite on two versions of this board now, and having recently played with Cougar Point and AMD Fusion (review to come), I find it slots nicely in the middle in most respects - especially CPU performance.  We're not seeing anything special from desktop Llano here - as Anand pointed out, it's in the region of Phenom II X4 performance.  The integrated GPU is a different matter, offering up to 2x the performance of the fastest integrated graphics solution in Intel's second-generation (Sandy Bridge) Core series.  Both of these points will show up in the results later in the review.
 
Overview

The ASRock A75 Extreme6 is aimed at the high end of the desktop Llano space and, from what ASRock are telling me, will ship at $150.  Unfortunately I don't have other A75 boards to compare it to yet, but I get the same steady feeling I've had from ASRock boards of late - it's just something that works.  There's one minor glitch I've found so far, to do with memory compatibility, but what we're dealing with here is a board with a good number of features, where the only real downside is the Llano chip itself.
If we look at the prices of 890-chipset motherboards, most are in the $100-$145 range, so in terms of CPU performance we're paying a little extra.  With the integrated APU at hand, however, you can see where the extra comes from - a different power delivery arrangement, and the need for various video output connectors.  Personally, I see desktop Llano as more of a niche area, so it will be interesting to see how aggressive the motherboard manufacturers are with pricing.
 
Visual Inspection

Desktop Llano features the same retention bracket spacing as AM2 and AM3, which, in all honesty, is quite big compared to Intel's sockets.  As a result, we see a bunched-up power delivery to the west of the socket itself, covered in a large heatsink of ASRock's design (a main difference from the preview board I tested earlier).  The chokes here are still the old design - if you recall, during my tour with ASRock at Computex, I noticed they were using a different choke design on their high end boards to improve power delivery, heat generation and efficiency (similar to MSI's SFC).  It seems that the new design still hasn't filtered down into their non-gaming products, despite the fact that I was told the two choke designs cost the same.

The socket area itself has five fan headers, of which I highly approve and would like to see on every other motherboard on the market - I'm using a Corsair H50, which in dual fan mode requires three (two for fans, one for the pump), so having another couple at hand for case fans is always a plus.  Two of the fan headers are labelled for the CPU (one four pin, one three pin), the two under the power delivery are chassis headers (both three pin), and the one above the power delivery is a PWR fan (again, three pin).  Elsewhere on the board is one more fan header, labelled as chassis (another three pin).

Beyond the 24-pin ATX connector is a USB 3.0 header, but unfortunately no USB 3.0 bracket comes with this board, unlike ASRock's Sandy Bridge Extreme series.  The eight (yes, eight) SATA 6 Gb/s ports come in a couple of flavors - six from the Fusion Controller Hub (FCH), and two more from an ASMedia controller.  We also see the Power/Reset/Debug LED combo, which I approve of.
 

The FCH heatsink is ASRock's low profile, silent design, which actually gets quite warm when anything is overclocked.  The slot layout is PCIe x1, PCIe x16, PCI, PCI, PCIe x16, PCI, PCIe x4 - with the x16 slots running in x8/x8 mode when two discrete GPUs are used.


The I/O panel is actually fairly bare, as ASRock haven't decided to stack anything on top of the HDMI port.  With some extra effort, I'm sure a pair of USB ports could be stacked on top of it.  Actually, in my initial preview piece, I noted how the USB ports on the far left (physically at the top of the board) were quite tough to get anything into beyond a USB mouse or keyboard if the DVI video out was used.  ASRock's response was to 'use a USB port extender - we had to cater for those using DVI + HDMI at the same time', which in my eyes is a stone's throw away from 'oops, we didn't spot that'.  Other connectivity comes in the form of a PS/2 port, two native USB 3.0 ports, two USB 3.0 ports from an ASMedia ASM1042 controller, a VGA port, two USB 2.0 ports, an eSATA 3 Gbps port, Realtek-controlled gigabit Ethernet, a FireWire port, an optical S/PDIF out, and the standard audio outputs.

Novatel Wireless MiFi 4510L Review - The Best 4G LTE WiFi Hotspot

A while back we explored almost all of Verizon’s 4G LTE network launch hardware - two USB modems, the Samsung SCH-LC11 hotspot, and the HTC Thunderbolt, to be exact. Since then, one more WiFi hotspot product has launched which we’ve been playing with for a long time, the MiFi 4510L from Novatel Wireless. The SCH-LC11 was a decent hotspot but still didn’t quite nail everything.

It's pretty amazing to me how quickly Novatel Wireless' MiFi brand became synonymous with portable cellular hotspots. The MiFi 2200 is an iconic product that pops up just about everywhere and has enjoyed well-deserved, almost unchallenged success on practically every single CDMA2000 carrier in the US. For many smartphone users, software tools like WMWifiRouter that turned a smartphone into a portable WiFi access point (long before Android added its own wireless AP feature) were old hat, but Novatel's MiFi was a nicely packaged solution that was much easier to swallow. Novatel has kept the MiFi updated, but has primarily focused on versions with exclusively 3GPP (GSM/UMTS) connectivity. The deployment of Verizon's 4G LTE network thus necessitated another update, and Novatel's answer is the MiFi 4510L, which includes support for the carrier's 700 MHz LTE and 800 / 1900 MHz 1x/EvDO Rev. A networks.


 
The LTE-enabled MiFi 4510L next to its older sibling, the MiFi 2200

I was a bit surprised to see Novatel beaten to launch by Samsung, whose SCH-LC11 hotspot (which we reviewed) came before the MiFi variant by a fair margin. At the time, I was satisfied with the SCH-LC11 but still looking for a few additional important things. Chief among those were 5 GHz 802.11a/n support, the ability to choose between using the device as a modem or simply charging it when connected over USB, GPS support, and more customization options inside the web control pages.


 
Samsung's SCH-LC11 hotspot

Unfortunately the MiFi 4510L doesn't really bring anything different on those fronts, and as we'll see in a moment, it is actually based around the exact same combination of MDM9600 baseband and WCN1312 WLAN silicon, with no discrete applications processor.

First things first: how does the MiFi 4510L compare physically to its predecessor and the SCH-LC11? Unfortunately, the SCH-LC11 had to go back to Verizon before the MiFi 4510L arrived, so I don't have any side by side shots with that device; however, the table below tells the story of how they compare when it comes to size and weight.

Portable Hotspot Comparison
  | Novatel Wireless MiFi 2200 | Samsung SCH-LC11 | Novatel Wireless MiFi 4510L
Height | 59 mm (2.32") | 59 mm (2.32") | 60 mm (2.36")
Width | 89 mm (3.50") | 90 mm (3.54") | 95 mm (3.74")
Depth | 8.8 mm (0.35") | 11 mm (0.43") | 13 mm (0.53")
Weight | 59 g (2.08 oz) | 81.5 g (2.87 oz) | 88.6 g (3.13 oz)
Network Support | 800 / 1900 (1x/EvDO Rev.A/0) | 700 MHz (LTE), 800 / 1900 (1x/EvDO Rev.A/0) | 700 MHz (LTE), 800 / 1900 (1x/EvDO Rev.A/0)
Battery Size | Removable 4.25 Whr | Removable 5.55 Whr | Removable 5.6 Whr

It's clear to me that the original MiFi 2200 is still the form factor to beat, leading in both overall package volume and weight. The move to LTE has necessitated both a PCB that spans the entire length of the device and a larger battery, and that definitely shows.

The Sandy Bridge Pentium Review: G850, G840, G620 & G620T Tested

In 2006 Intel introduced its tick-tock cadence for microprocessor releases. Every year would see the release of a new family of microprocessors as either a tick or a tock. Ticks would keep architectures relatively unchanged and focus on transitions to smaller manufacturing technologies, while tocks would keep fab process the same and revamp architecture. Sandy Bridge was the most recent tock, and arguably the biggest one since Intel started down this road.



At a high level the Sandy Bridge CPU architecture looked unchanged from prior iterations. Intel still put forth a 4-issue machine with a similar number of execution resources to prior designs. Looking a bit closer revealed that Intel completely redesigned the out-of-order execution engine in Sandy Bridge, while heavily modifying its front end. Sandy Bridge also introduced Intel's high performance ring bus, allowing access to L3 by all of the cores as well as Intel's new on-die GPU.

The Sandy Bridge GPU was particularly surprising. While it pales in comparison to the performance of the GPU in AMD's Llano, it does represent the first substantial effort by Intel in the GPU space. Alongside the integrated GPU was Intel's first hardware video transcoding engine: Quick Sync. In our initial review we found that Quick Sync was the best way to quickly transcode videos, beating out both AMD and NVIDIA GPU based implementations in our tests. Quick Sync adoption has been limited at best, which is unfortunate given how well the feature performed in our tests.

Sandy Bridge wasn't all rosy however. It was the first architecture that Intel shipped with overclocking disabled on certain parts. Any CPU without Turbo Boost enabled is effectively unoverclockable. Intel killed the low end overclocking market with Sandy Bridge.


The overclocking limits were a shame as Sandy Bridge spanned a wide range of price points. The low end Core i3-2100 was listed at $117 while the highest end Core i7-2600K came in at $317. While you can't claim that Sandy Bridge was overpriced at the high end, there's always room for improvement.

Despite abandoning Pentium as a high end brand with the 2006 release of Intel's Core 2 Duo, Intel has kept the label around for use on its value mainstream parts. Last year we saw only two Pentium branded Clarkdale parts: the G6950 and G6960. This year, powered by Sandy Bridge, the Pentium brand is a bit more active.

Processor | Core Clock | Cores / Threads | L3 Cache | Max Turbo | Max Overclock Multiplier | TDP | Price
Intel Core i7 2600K | 3.4GHz | 4 / 8 | 8MB | 3.8GHz | 57x | 95W | $317
Intel Core i7 2600 | 3.4GHz | 4 / 8 | 8MB | 3.8GHz | 42x | 95W | $294
Intel Core i5 2500K | 3.3GHz | 4 / 4 | 6MB | 3.7GHz | 57x | 95W | $216
Intel Core i5 2500 | 3.3GHz | 4 / 4 | 6MB | 3.7GHz | 41x | 95W | $205
Intel Core i5 2400 | 3.1GHz | 4 / 4 | 6MB | 3.4GHz | 38x | 95W | $184
Intel Core i5 2300 | 2.8GHz | 4 / 4 | 6MB | 3.1GHz | 34x | 95W | $177
Intel Core i3 2120 | 3.3GHz | 2 / 4 | 3MB | N/A | N/A | 65W | $138
Intel Core i3 2100 | 2.93GHz | 2 / 4 | 3MB | N/A | N/A | 65W | $117
Intel Pentium G850 | 2.9GHz | 2 / 2 | 3MB | N/A | N/A | 65W | $86
Intel Pentium G840 | 2.8GHz | 2 / 2 | 3MB | N/A | N/A | 65W | $75
Intel Pentium G620 | 2.6GHz | 2 / 2 | 3MB | N/A | N/A | 65W | $64
Intel Pentium G620T | 2.2GHz | 2 / 2 | 3MB | N/A | N/A | 35W | $70

The new Sandy Bridge based Pentiums fall into two lines at present: the G800 and G600 series. All SNB Pentiums have two cores (Hyper-Threading disabled) with 256KB of L2 cache per core and a 3MB L3 cache. CPU core turbo is disabled across the entire Pentium line. From a performance standpoint, other than the missing Hyper-Threading and lower clocks, the Sandy Bridge Pentiums are very similar to Intel's Core i3.

Intel continues to separate the low end from the high end by limiting supported instructions. None of the Pentiums support AES-NI or VT-d. Other than higher clock speeds, the 800 series adds only official DDR3-1333 support; the 600 series officially supports up to DDR3-1066.
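If you want to verify what a given chip actually supports, you can query CPUID directly. Below is a minimal sketch using GCC/Clang's <cpuid.h> helper; note that CPUID reports VT-x (VMX) on the CPU, while VT-d is a platform-level feature it doesn't cover:

```c
#include <stdio.h>
#include <cpuid.h> /* GCC/Clang wrapper around the x86 CPUID instruction */

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        puts("CPUID leaf 1 unavailable");
        return 1;
    }
    /* CPUID leaf 1, ECX: bit 25 = AES-NI, bit 5 = VMX (VT-x). */
    printf("AES-NI: %s\n", (ecx & (1u << 25)) ? "yes" : "no");
    printf("VMX:    %s\n", (ecx & (1u << 5))  ? "yes" : "no");
    return 0;
}
```

On a Sandy Bridge Pentium the AES-NI line should come back "no", while the Core i5/i7 parts report "yes".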

All standard Pentiums carry a 65W TDP. The Pentium G620T runs at a meager 2.2GHz and manages a 35W TDP. Regardless of thermal rating, the boxed SNB Pentiums come with an ultra low profile cooler:


These Pentium CPUs work in the same 6-series LGA-1155 motherboards as their Core i3/5/7 counterparts. The same rules apply here as well. If you want video out from the on-die GPU you need either an H-series or a Z-series chipset.

The Pentium GPU

When Intel moved its integrated graphics on-package with Clarkdale it dropped the GMA moniker and started calling it HD Graphics. When it introduced the Sandy Bridge Core i3/5/7, Intel added the 2000 and 3000 suffixes to the HD Graphics brand. With the Sandy Bridge Pentium, Intel has gone back to calling its on-die GPU "HD Graphics".
Despite the name, the Pentium's HD Graphics has nothing in common with Clarkdale's GPU. The GPU is still on-die and it features the same architecture as Intel's HD Graphics 2000 (6 EUs). Performance should be pretty similar as it even shares the same clock speeds as the HD 2000 (850MHz base, 1.1GHz turbo for most models). I ran a quick test to confirm that what Intel is selling as HD Graphics is really no different than the HD Graphics 2000 in 3D performance:

(Chart: Intel HD Graphics vs. HD Graphics 2000 vs. HD Graphics 3000 - Crysis Warhead)

All is well in the world.

Where the vanilla HD Graphics loses out is in video features: Quick Sync, InTru 3D (Blu-ray 3D), Intel Insider (DRM support for web streaming of high bitrate HD video) and Clear Video HD (GPU accelerated post processing) are all gone. Thankfully you do still get hardware H.264 video acceleration and full audio bitstreaming support (including TrueHD/DTS-HD MA).

Missing Quick Sync is a major blow, although as I mentioned earlier I'm very disappointed in the poor support for the feature outside of the initial launch applications. The rest of the features vary in importance. To someone building a basic HTPC, a Sandy Bridge Pentium will do just fine. Personally I never play anything in 3D, never use the Clear Video HD features and never use Intel Insider so I wouldn't notice the difference between a Sandy Bridge Pentium and a Core i5 for video playback.
