Tuesday, July 19, 2011
A tipster dropped a link our way early this morning to a GLBenchmark 2.0 results browser page that might be of interest to many.
Hit that page and you'll find a complete run of the current GLBenchmark 2.0 suite, and a bit of a confirmation about what is and isn't the upcoming Motorola Droid 3. It isn't Tegra 2, it's OMAP 4430.
First off, the hardware specifications that we can glean from the information page seem to make it a virtually identical match with the specs of the Motorola Milestone 3, or XT883. That means a qHD 960x540 display likely 4" in size, hardware five-row QWERTY keyboard, Android 2.3.4, a possible 1 GB of LPDDR2 (512 MB is the other less-desirable, rumored number), 8 MP rear camera, front camera, and of course an accompanying CDMA2000 baseband for Verizon.
Given the number of recent flagship Motorola product launches with Tegra 2 SoCs, starting with the Motorola Atrix and continuing with the Droid X2 and the now-delayed Droid Bionic, it seemed that a flagship (read: QWERTY keyboard-packing) summer Motorola Droid launch with Tegra 2 was inevitable. From the results page, it now seems that the Motorola Droid 3 will include a 1GHz OMAP 4430 SoC with PowerVR SGX 540 graphics, and not a Tegra 2.
The model in the results browser is codenamed "Solana," which matches the codename we've heard for the Verizon-bound Motorola Droid 3. The photos below are of the XT883, to which the Motorola Droid 3 will undoubtedly bear an uncanny resemblance.
Motorola Milestone 3 - XT883 for China (courtesy: Motorola Mobility)
The benchmarks themselves paint an interesting picture, and while we're at it I've tossed in some other devices we have in-hand but haven't yet finished full reviews of, for comparison with these results from the Motorola Droid 3. Again, the GLBenchmark 2.0 runs we're reporting here are at native resolution for the respective devices, which we've now included in the description line for comparison. For a quick refresher, WVGA is 800x480, FWVGA is 854x480, and qHD is 960x540.
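Because these runs are at native resolution, the per-frame pixel load differs from panel to panel; a quick back-of-the-envelope comparison (our own arithmetic, not part of the benchmark suite) shows why raw frame rates aren't directly comparable:

```python
# Pixel counts for the panel resolutions mentioned above; in a
# native-resolution benchmark run, each frame's fill load scales with these.
resolutions = {
    "WVGA":  (800, 480),
    "FWVGA": (854, 480),
    "qHD":   (960, 540),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in sorted(pixels.items(), key=lambda p: p[1]):
    print(f"{name}: {count:,} pixels")

# A qHD device pushes 35% more pixels per frame than a WVGA device.
print(f"qHD vs. WVGA: {pixels['qHD'] / pixels['WVGA']:.2f}x")
```

So a qHD phone that merely ties a WVGA phone on frame rate is actually doing noticeably more work.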
I've highlighted the previous Motorola Droid 2 results (which we reviewed last year) just for fun to illustrate how far we've come in one calendar year.
The Motorola Droid 3 results are impressive, edging out the Adreno 220-packing HTC Sensation and EVO 3D in the more demanding Egypt test, but not quite in Pro. No doubt this is thanks at least partially to the newer Imagination drivers which give a small performance boost.
We've had our hands on quite a few gaming notebooks here, but most of the time they're Clevo-based machines. These aren't necessarily bad notebooks; they're fast, typically have good screens, and they get the job done.
Yet they also have some persistent drawbacks: build quality often isn't that hot, the battery is a glorified UPS system, and they feature some of the worst keyboards on the market. ASUS, MSI, Toshiba, and HP all offer fairly compelling alternatives, and today Alienware brings us a particularly interesting contender in the form of the M17x R3.
Truth be told, I was ambivalent about laying hands on the M17x R3. Gaming notebooks can tend to be gaudy affairs, and Alienware's notebooks (at least on the shelf) are practically exemplars of this goofy kind of excess. But there's something to be said for a little bling, and if the whole thing feels right, who's to really complain if it looks like the gaming equivalent of a racecar bed?
Performance-wise, it's definitely going to feel right. Alienware has upgraded the M17x R3 with Sandy Bridge processors, and graphics options start at the AMD Radeon HD 6870M, upgradeable to the NVIDIA GeForce GTX 460M. Or you can go for the big daddy like our review sample has: the AMD Radeon HD 6970M.
Alienware M17x R3 Gaming Notebook
Processor Intel Core i7-2720QM
(4x2.2GHz + HTT, 3.3GHz Turbo, 32nm, 6MB L3, 45W)
Chipset Intel HM67
Memory 4x2GB Hynix DDR3-1333 (Max 4x4GB)
Graphics AMD Radeon HD 6970M 2GB GDDR5
(960 stream processors, 680MHz/3.6GHz core/memory clocks, 256-bit memory bus)
Display 17.3" LED Glossy 16:9 1080p (1920x1080)
LG Philips LGD 02DA
Hard Drive(s) 2x Seagate Momentus 750GB 7200-RPM HDD in RAID 0
Optical Drive Slot-loading Blu-ray/DVDRW Combo (HL-DT-ST CA30N)
Networking Atheros AR8151 PCIe Gigabit Ethernet
Intel Centrino Ultimate-N 6300 a/b/g/n
Internal WirelessHD (with external receiver included)
Audio IDT 92HD73C1 HD Audio
S/PDIF, mic, and two headphone jacks
Battery 9-Cell, 11.1V, 90Wh
Front Side N/A (Speaker grilles)
Right Side MMC/SD/MS Flash reader
Slot-loading optical drive
2x USB 2.0
eSATA/USB 2.0 combo port
Left Side Kensington lock
eSATA/USB combo port
2x USB 3.0
S/PDIF, mic, and two headphone jacks
Back Side AC jack
2x exhaust vents
Operating System Windows 7 Home Premium 64-bit
Dimensions 16.14" x 11.96" x 1.75-1.77" (WxDxH)
Weight ~9.39 lbs
Extras 3MP Webcam
Backlit keyboard with 10-key
Flash reader (MMC, SD/Mini SD, MS/Duo/Pro/Pro Duo)
Warranty 1-year standard warranty
2-year, 3-year, and 4-year extended warranties available
Pricing Starting at $1,499
Price as configured: $2,503
The Sandy Bridge processor at its heart is the major part of this refresh of the M17x. You can custom order all the way up to the Intel Core i7-2820QM (the 55-watt i7-2920XM isn't available), but the i7-2720QM presents a nice balance of performance and value. With a 2.2GHz nominal clock rate capable of turbo-ing up to 3.3GHz on a single core (or 3GHz on all four cores), the i7-2720QM should offer more than enough processing horsepower. Alienware also hangs four DIMM slots instead of two off the i7's memory controller, allowing for a maximum of 16GB of memory, enough to get some serious work done.
Handling graphics duties is the AMD Radeon HD 6970M, basically a mobile version of the desktop Radeon HD 6850. This is arguably the fastest mobile GPU currently available, duking it out with NVIDIA's GeForce GTX 485M for the top slot. It features 960 stream processors, a 680MHz core clock, and 2GB of GDDR5 clocked to an effective 3.6GHz on a 256-bit bus for a staggering 115.2 GB/sec of memory bandwidth. The M17x R3 also supports GPU switching, allowing you to switch to the IGP while on the battery to substantially improve running time. Unfortunately the solution here isn't quite as automatic or seamless as NVIDIA's Optimus, but it gets the job done.
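That quoted bandwidth figure follows directly from the memory spec; a quick sanity check using only the numbers above:

```python
# GDDR5 bandwidth = effective data rate (transfers/sec) x bus width (bytes).
effective_rate = 3.6e9      # 3.6GHz effective GDDR5 data rate
bus_width_bytes = 256 // 8  # 256-bit bus = 32 bytes per transfer

bandwidth_gb_s = effective_rate * bus_width_bytes / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")  # matches the 115.2 GB/sec quoted above
```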
The M17x R3 sports two drive bays, but the storage options offered on the Dell website leave something to be desired. The default configuration is a pair of 320GB, 7200-RPM hard drives in RAID 0, and in fact, outside of a single 256GB SSD option, everything is RAID 0. Understanding that the M17x R3 should spend most of its life on your desktop, this is nonetheless a disappointing set of options. Ideally you'd want an SSD serving as the boot drive and a HDD handling mass storage duties. I use a RAID 0 on my desktop for my scratch video drive and gaming drive, but honestly for the latter it's not a substantial improvement. In a notebook, even one that will live its life on flat surfaces, this is still a questionable choice.
From here there are three fairly sizable selling points for the M17x R3: HDMI in, wireless display, and 3D. The HDMI input is only 1.3 and can't support 3D should you configure the M17x with the 120Hz 3D screen option, but for connecting your PS3 or Xbox 360 it's sufficient and works basically as a passthrough to the laptop screen. The built-in wireless display connectivity isn't tied to Intel's Wi-Di but instead uses WiHD. Like most wireless display technologies, though, I had some trouble getting this one working right. While Vivek is a big fan of things like Intel's Wi-Di, I'm not really sold on it; you still have to connect a receiver box to your TV's HDMI port, and frankly, if you can afford to buy this notebook, you can afford to buy a dedicated Blu-ray player with Netflix and Hulu functionality built in. Finally, there's a 120Hz 3D-capable panel option for those so inclined, but unfortunately our review unit didn't include it, so we weren't able to test it.
Google will support HTC in its patent battle against Apple, Eric Schmidt says. He is a former Apple director. Schmidt accused Apple of resorting to lawsuits instead of innovations and said "We will make sure" HTC doesn't lose. An analyst said Google needs to protect Android licensees, and the battle between HTC and Apple could be long.
Google's executive chairman is wading into the middle of the Apple vs HTC battle. Eric Schmidt defended HTC against a company where he used to sit on the board of directors.
At Google's Mobile Conference in Tokyo on Tuesday, Schmidt pledged Google's support for HTC in its patent battle against Apple. HTC is appealing a U.S. International Trade Commission ruling that the company trespassed on two Apple patents with its Google Android-based smartphones. Apple had charged that HTC violated 10 patents.
Grace Lei, general counsel for HTC, said the company will "vigorously fight these two remaining patents through an appeal before the ITC commissioners who make the final decisions."
"We have seen an explosion of Android devices entering the market and, because of our successes, competitors are responding with lawsuits, as they cannot respond through innovations," Schmidt said. "I'm not too worried about this."
Although he didn't say exactly how Google would help, whether via attorneys or financial help or some other support, Schmidt did say, "We will make sure they don't lose, then."
Google has a lot at stake. The company just announced that it sees 550,000 Android activations a day, outpacing Apple's iOS, Microsoft's Windows Phone 7, and the BlackBerry OS.
Schmidt said of a once-close ally, without naming names: "The big news in the past year has been the explosion of Google Android handsets, and this means our competitors are responding. Because they are not responding with innovation, they're responding with lawsuits. We have not done anything wrong and these lawsuits are just inspired by our success."
Avi Greengart, an analyst at Current Analysis, said if Google believes that HTC is being sued over intellectual property that is actually core Android functionality, then it would be understandable for Google to ensure its licensees are protected.
"We are at a stage of the market where intellectual property is incredibly important and valuable, and everyone is suing everyone else," Greengart said. "There are some companies at which IP is an important part of the overall corporate value. If you don't properly monetize that intellectual property, your shareholders get understandably upset. The way you monetize that is to get people to license your patents. In cases where that doesn't work, you bring them to court."
Greengart said some IP suits are about licensing terms. In such cases, he said, it's about trying to level the playing field and taking full advantage of the innovations the company brought to market and getting full value for them.
"Patent suits tend to be a long process," Greengart said. "If you follow cases on a day-by-day, blow-by-blow basis, you often end up assuming things are more dire than they actually are for any of the parties involved."
Monday, July 18, 2011
Smartphones are fast becoming the personal convergence device, replacing digital cameras and camcorders as well as portable media players (PMPs).
However, there is another aspect to the convergence paradigm being played out at the residential level. Today's connected home has multiple devices for various purposes. For example, I have a modem and a wireless router to make sure that all my devices can connect to the Internet. I have also been through umpteen boxes bringing media from online streaming services as well as my NAS to the TV. Home control and automation systems (helping with energy management, home monitoring etc.) are also installed by service providers and commercial integrators. Wouldn't it be nice to have a single device to manage all these aspects? The icing on the cake would be if such a device were to be easily installed by the consumers themselves (saving money in the process for the service providers like AT&T, Comcast and Verizon).
Today, Sigma Designs is launching [PDF] Skini, a reference platform for products aimed at changing the way consumers interact with the living room. A thin client with over-the-top (OTT) and cable / telco connectivity in one box, the Skini platform also benefits from being a part of the Z-Wave home control solution. The ease of use of the products based on this platform is supposed to make professional installation unnecessary.
The current crop of Smart TVs may bring the Internet to your living room. Unfortunately, putting OTT services like Netflix directly into a device that has a longer update cycle (consumers don't go out and buy a TV every two years) is not really a good idea. For example, your current TV probably doesn't support HD audio passthrough back to an AV receiver, and it is not implausible that Netflix might stream videos with HD audio in the future. It is better for consumers to just update a WDTV-like box to enjoy the new feature, rather than buy a new TV. This makes the television a bad candidate for the home convergence device.
A set top box (not the traditional ones only, but includes media streamers too), on the other hand, has a much shorter update cycle. It makes definite sense for this to be the convergence device. Now, let us take a look at what Sigma Designs has put in Skini towards this:
* Video Decoder [ Sigma Designs 8670 ]
This is the heart of the STB. Using a decoder chip targeted towards the IPTV market (the next-gen version of the SMP 8652 chip used in the Dune Lite) lowers the power consumption and BOM cost. For example, this chip requires only 32b DRAM instead of the 64b used in chips like the SMP 8643 / SMP 8655. Readers should note that this is not a decoder chip targeted towards the videophiles (so don't expect Blu-ray capabilities even for ISOs), but more towards OTT applications like Netflix. Sigma is undoubtedly one of the leading decoder SoC manufacturers for connected set top boxes, and we have no doubt that the appropriate SoC has been chosen for this platform.
* Powerline Networking [ CG2110 ]
This enables Skini's networking capabilities. The platform doesn't need to be equipped with an Ethernet port or a Wi-Fi solution because it can grab content from the network using this section. Conforming to the 200 Mbps HomePlug AV standards, the CG2110 is a product from Sigma's CopperGate acquisition. With ClearPath technology (which I had covered here earlier), this product looks pretty capable. However, my main source of worry is that I have yet to spot this chip in any retail shipping powerline networking product. Sigma Designs indicated that products based on the chip are not in the retail market, but only being shipped to service providers. I am trying to find out more information about who they are.
* Z-Wave Wireless RF Home Control
Z-Wave is the de facto standard for wireless RF communication within the home. With the integration of Z-Wave technology, the Skini becomes another part of the connected home. Due to its RF nature, any product based on Skini can be easily hidden behind the display, and yet be remote controlled. In addition, becoming a part of the Z-Wave mesh network also enables it to control other connected devices.
The Skini platform looks to be a good first step towards making convergence in a residential setting realistic and affordable. However, the networking gateway aspect remains incomplete as yet. Consumers using devices based on Skini still need a router / modem to connect their network to the outside world. Residential gateways based on G.hn silicon can interface with either coax (cable) or phone lines (DSL) to connect to the service providers' networks. We will have to wait for Sigma's G.hn products to mature before the next convergence step is taken.
All the pieces of Skini appear to be market-proven technologies. Only the powerline networking segment remains an unknown quantity outside the Sigma Designs labs. ODMs picking up the Skini platform can be expected to launch products in the market around a year or so from now.
UPDATE: The official press release is linked here [PDF]. We were given to understand during the briefing last week that the reference platform would be named 'Skini'. However, the PR today doesn't have any special tag for it. We are letting our news piece stand as-is.
We’ve just returned from sunny Bellevue, Washington, where AMD held their first Fusion Developer Summit (AFDS). As with other technical conferences of this nature such as NVIDIA’s GTC and Intel’s IDF, AFDS is a chance for AMD to reach out to developers to prepare them for future products and to receive feedback in turn.
While AMD can make powerful hardware it’s ultimately the software that runs on it that drives sales, so it’s important for them to reach out to developers to ensure that such software is being made.
AFDS 2011 served as a focal point for several different things going on at AMD. At its broadest, it was a launch event for Llano, AMD’s first mainstream Fusion APU that launched at the start of the week. AMD has invested the future of the company into APUs, and not just for graphical purposes but for compute purposes too. So Llano is a big deal for the company even though it’s only a taste of what’s to come.
The second purpose of course was to provide sessions for developers to learn more about how to utilize AMD’s GPUs for compute and graphics tasks. Microsoft, Acceleware, Adobe, academic researchers, and others were on hand to provide talks on how they’re using GPUs in current and future projects.
The final purpose – and what is going to be most interesting to most outside observers – was to prepare developers for what’s coming down the pipe. AMD has big plans for the future and it’s important to get developers involved as soon as is reasonably possible so that they’re ready to use AMD’s future technologies when they launch. Over the next few days we’ll talk about a couple of different things AMD is working on, and today we’ll start with the first and most exciting project: AMD Graphics Core Next.
Graphics Core Next (GCN) is the architectural basis for AMD’s future GPUs, both for discrete products and for GPUs integrated with CPUs as part of AMD’s APU products. AMD will be instituting a major overhaul of its traditional GPU architecture for future generation products in order to meet the direction of the market and where they want to go with their GPUs in the future.
While graphics performance and features have been and will continue to be important aspects of a GPU's design, AMD and the rest of the market have been moving towards further exploiting the compute capabilities of GPUs, which in the right circumstances can be utilized as massively parallel processors, completing certain tasks in a fraction of the time a highly generalized CPU would take. Since the introduction of shader-capable GPUs in 2002, GPUs have slowly evolved to become more generalized so that their resources can be used for more than just graphics. AMD's most recent shift was with their VLIW4 architecture with Cayman late last year; now they're looking to make their biggest leap yet with GCN.
GCN at its core is the basis of a GPU that performs well at both graphical and computing tasks. AMD has stretched their traditional VLIW architecture as far as they reasonably can for computing purposes, and as more developers get on board for GPU computing a clean break is needed in order to build a better performing GPU to meet their needs. This is in essence AMD’s Fermi: a new architecture and a radical overhaul to make a GPU that is as monstrous at computing as it is at graphics. And this is the story of the architecture that AMD will be building to make it happen.
Finally, it should be noted that the theme of AFDS 2011 was heterogeneous computing, as it has become AMD’s focus to get developers to develop heterogeneous applications that effectively utilize both AMD’s CPUs and AMD’s GPUs. Ostensibly AFDS is a conference about GPU computing, but AMD’s true strength is not their CPU side or their GPU side, it’s the combination of the two. Bulldozer will be the first half of AMD’s future APUs, while GCN will be the other half.
University students will be able to rent textbooks through Amazon's Kindle Textbook Rental program -- even if they don't own a Kindle e-reader. With free Kindle reader apps, Kindle Textbook Rental participants will be able to read the textbooks on multiple computing devices. Besides saving money, students will be able to keep study notes.
Amazon.com has launched Kindle Textbook Rental -- a new program that enables university students to rent the textbooks they need for courses while realizing substantial savings off the list prices of selected titles from Elsevier, John Wiley & Sons, Taylor & Francis, and other textbook publishers. Even better, students don't need to own a Kindle e-reader to participate in the program.
Students will be able to access Kindle editions of textbooks they rent on PCs, Macs, iPads and many smartphones by downloading free Kindle reading apps. According to the online retail giant, tens of thousands of textbooks are available for the 2011 school year.
Students will be able to select the rental period that best fits their needs: from 30 days up to 360 days. Students now have the "option to rent Kindle textbooks and only pay for the time they need -- with savings up to 80 percent off the print list price on a 30-day rental," said Amazon Vice President Dave Limp.
Text Highlights and Notes
Students can determine whether Kindle editions of the textbooks they need are available by searching Amazon's online Textbooks Store, where they can even preview portions of selected titles using Amazon's "look inside" feature. If a title is available for rental, the student merely needs to select the desired rental period and click on the "rent now" button.
For example, the Kindle edition of the textbook Operating System Concepts is priced at $70.80 but can be rented for 30 days for $29.46. Additionally, students in need of additional time will be able to extend the rental period by paying for the additional days they require.
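The discount in that example is easy to work out relative to the Kindle purchase price (note that Amazon's "up to 80 percent" figure is measured against the print list price, which isn't given here):

```python
# Savings for the Operating System Concepts example above: renting for
# 30 days vs. buying the Kindle edition outright.
kindle_price = 70.80   # Kindle edition purchase price
rental_30_day = 29.46  # 30-day rental price

savings = 1 - rental_30_day / kindle_price
print(f"{savings:.1%} off the Kindle purchase price")  # 58.4%
```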
Amazon's Whispersync technology synchronizes the reader's bookmarks as well as saves notes and highlighted text across multiple devices. What's more, the new program gives students the ability to search for keywords and phrases within any textbook.
Rent Once, Read Everywhere
Students will be able to retain their personalized notes as well as highlighted text. So even after the rental period ends, they will continue to have access to their study notes and highlighted text -- for review before taking an exam, for example.
"Normally, when you sell your print textbook at the end of the semester, you lose all the margin notes and highlights you made as you were studying," Limp said. "We're extending our Whispersync technology so that you get to keep and access all of your notes and highlighted content in the Amazon Cloud, available anytime, anywhere."
Kindle Textbook Rental delivers a "rent once, read everywhere" experience similar to what consumers get when they purchase the Kindle edition of any book title. So students will be able to seamlessly access the textbooks they rent across all the computing devices for which free Kindle reading apps are available.
Amazon's Kindle technology opens the door to the possibility that two or more students could form a study group and share a single Amazon account. In theory, the participants in a study group would be able to share their insights via the notes they enter as well as the text segments they highlight.
Sunday, July 17, 2011
Intel released its second-generation Core CPUs back in January. Unfortunately, the excitement generated by the release of the fastest mainstream desktop processors was quickly dampened by the Cougar Point chipset recall.
To be clear, this issue affected only the earliest Sandy Bridge-compatible motherboards, and not the Sandy Bridge CPUs themselves. This issue is now fixed—there are no defective motherboards available through reputable North American retailers like Newegg and Amazon. In the almost half-year since the initial Sandy Bridge CPU release, the platform has matured, with CPU variants available for almost every budget and a number of niches, as well as motherboard chipsets with a variety of feature sets and in form factors from mini-ITX to extended-ATX. Succinctly, the second-gen Core CPUs are astonishingly powerful and sip electricity. As Anand aptly described them, “architecturally it’s the biggest change we’ve seen since Conroe.” I agree with Anand—not since I upgraded from an AMD Athlon X2 3800+ to an Intel Core 2 Duo E6600 at the end of 2006 have I been so impressed by a new CPU as I have by the Core i7-2600K.
This is the first guide I’ve written for AnandTech that will not be ‘fair and balanced’ for both AMD and Intel. I hoped this month’s guide would detail higher-end builds featuring and comparing AMD’s Bulldozer CPUs and Intel’s Core i5 and i7 chips, but unfortunately, AMD’s release of its high-end desktop Bulldozer SKUs is now delayed until September. The midrange Llano desktop APUs are scheduled for retail availability in early July, and Llano-based laptops are already showing up here and there online (though as of the time of writing, they are not available for actual sale). Thus, AMD’s entire product line will be refreshed within the next few months. With the imminent release of radically new APUs and no currently available AMD CPUs that can compete with Intel’s higher-end CPUs, this month’s guide focuses on the second-generation Intel Core processors. I simply don’t think it makes much sense to build an AMD system at least until Llano’s desktop release—unless you need a budget rig and you need it right now. And lest I be accused of favoritism, next month’s guide will likely focus on Llano-based desktop computers.
It’s also a great time to build an Intel-based computer. The successor to LGA 1155 (the Sandy Bridge socket), LGA 2011, is not due out until late this year, and looks to supersede LGA 1366 at the highest end of Intel’s desktop CPU spectrum. Other than supporting Sandy Bridge-E CPUs, LGA 2011 will offer PCIe 3 (which current GPUs can’t take advantage of) and native USB 3.0 (even though third-party USB 3.0 controllers are already shipping on many Intel and AMD motherboards). Considering how capable the Core i5-2500K and Core i7-2600K are today, it’s unlikely Sandy Bridge-E will field any model that’s astonishingly faster than what’s already available. Thus, if you buy a Core i7-2600K now, you’ll be near the pinnacle of desktop computing for at least 5-6 months. I think there are times to buy and times to wait. It’s a bad idea to buy right before a lineup refresh (as is the case with AMD today), but it’s also unwise to delay building a system to hold out for the next big thing when that’s half a year away and unlikely to be that much better!
I mentioned in our Mid-Range SSD Roundup that when it comes to review samples, most SSD vendors like to send out the capacity that offers the best balance of capacity and performance.
For the SandForce SF-2281 with 25nm NAND, that just happens to be 240GB. Unfortunately there's a pretty big fall-off in performance when going from 240GB to 120GB due to the decrease in the total number of NAND die (at 8GB per die: 32 die in the 240GB drive vs. 16 in the 120GB). I've explained this all before here.
Enter OCZ's MAX IOPS drive. Using 32nm Toshiba Toggle NAND instead of 25nm IMFT ONFI 2.x NAND, the die capacity drops to 4GB, which means you get twice as many die per NAND device. The end result? 240GB Vertex 3 performance for slightly more than 120GB Vertex 3 pricing.
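The die math behind that claim is simple; the sketch below assumes, as is typical for SandForce drives with their ~7% spare area, that a 240GB user capacity sits on 256GB of raw NAND and 120GB on 128GB:

```python
# The number of NAND die behind the controller determines how much
# interleaving (and thus performance) is available. Die capacities are from
# the article: 8GB for 25nm IMFT, 4GB for 32nm Toshiba Toggle NAND.
# Raw capacities (an assumption here) include SandForce's usual spare area.
drives = {
    # name: (raw NAND in GB, die capacity in GB)
    "Vertex 3 240GB (25nm IMFT)":    (256, 8),
    "Vertex 3 120GB (25nm IMFT)":    (128, 8),
    "MAX IOPS 120GB (32nm Toshiba)": (128, 4),
}

for name, (raw_gb, die_gb) in drives.items():
    print(f"{name}: {raw_gb // die_gb} die")
# The 120GB MAX IOPS ends up with the same 32 die as a 240GB Vertex 3.
```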
I ordered the 120GB MAX IOPS drive at the beginning of the week and just got it in yesterday so I've only had a small amount of time to test with it thus far. Check out the 120GB MAX IOPS drive vs. the Intel SSD 510 in Bench using our 2011 storage test suite. Expect the full review in the coming days.
Chromebook laptops are Google's way of showcasing its "cloud computing" philosophy, in which everything you need is available on the Internet. Google believes storage and services are better handled by Internet-connected data centers located far from you. Computers running Windows tend to keep files and programs on individual machines.
New laptops running Google's Chrome operating system offer a new approach in portable computing: Games, productivity tools and anything else you might need are handled by distant computers connected to the Internet.
With this method, you don't store data on a hard drive inside the computer. That streamlines things, at the cost of the stronger, standalone applications that normally handle these tasks. But the trade-off might be worth it for the more casual consumers of online content.
Google already has a good variety of online services that will be key to any success for the set of laptops known as Chromebook. There's Gmail for messages, Google Plus for sharing photos and links and Google Docs for word processing, spreadsheets and other common tasks. Other companies also make free programs, which run through Google's Chrome browser.
All that is important because you can't install Microsoft Office or other software suites on the Chromebook. Everything done on the Chromebook has to be Web-based.
Chromebook is Google's way of showcasing its "cloud computing" philosophy, in which everything you need is available on the Internet. Google believes storage and services are better handled by Internet-connected data centers located far from you. By contrast, computers running Microsoft's Windows tend to keep files and programs on the individual machines in front of you.
Samsung Electronics Co. and Acer Inc. are making the first Chromebooks using Google's Chrome Web browser and an underlying operating system based on Linux.
Samsung's cheaper, Wi-Fi-only model retails for $429. It comes co-branded with Google's Chrome logo on the cover. It has two USB ports and slots for an SD memory card and a SIM phone card. You can connect an external monitor to it. You can also connect to the Internet wirelessly through Wi-Fi, but there's no Ethernet port to allow wired connections to a network, nor Bluetooth capability to connect to untethered external devices.
For $70 more, you can get a model that can connect through Verizon's 3G cellular network when Wi-Fi isn't an option. That's the model I tested, though I didn't end up needing the 3G capabilities because I always had Wi-Fi at work, at home and in cafes.
The unit I tried only had a 16 gigabyte solid state storage drive, but that's fine. I wasn't planning on hoarding video clips or music files. Documents, for the most part, are supposed to be stored online as part of Google's cloud philosophy.
Chromebook is a lean, mean browsing machine primarily because it urges users to move away from the local storage of content and data. Google's approach is to have you store your photos in a Web-based album such as Picasa, rather than in your "My Pictures" folder on your machine. Google Docs can store your writings and Google's Music beta (still invite-only at this stage) is positioned to handle your music collection.
Although you'd think it would be slower storing your files elsewhere, the experience is actually faster because the Chrome system doesn't have to be loaded with programs handling various tasks. You simply call those up online as you need them.
This approach will require faith. Storing content locally certainly gives you more control and better access, and more privacy as well. With its growing suite of services, though, Google is betting some habits will change with time.
The Chromebook took me to my login screen less than five seconds after turning it on. Less than five seconds later, I was staring at the Chrome browser and an initial offering of apps such as YouTube, Google Talk and Gmail. With my home Windows 7 install, I would likely still be staring at the Windows start-up logo in this same time frame.
I began by adding some of my own favorites to the browser, which essentially served as my home screen for launching apps. I pulled several apps from Google's Chrome Web Store.
Tweetdeck was among the better Chrome apps for displaying my Twitter feed. Adjusting the Tweetdeck application to full screen delivered an experience that is almost the same as what I'd get when using Tweetdeck's standalone application with a desktop PC.
Another decent app for Chrome is Wikihood. It's a Wikipedia-styled page that uses your wireless connection to determine your location and then provides you with some fast facts about the vicinity you're in. As I sat sipping coffee in an Atlanta cafe, Wikihood revealed to me that I was near the site of the Atlanta Campaign, a series of battles fought during the Civil War around Atlanta. There were links to information about the area in case I decided to explore on foot.
Popular Science has a slick app for Chrome, though it's more about form than function. Upon launching, the app delivered the magazine's online articles with beautiful photos and artist's renderings of scientific topics that covered the entire screen. The articles aren't extremely long: more than a tweet, but shorter than the 3,000-word long-form pieces some of the magazine's readers might be accustomed to.
Aside from the apps, there isn't too much personalization you can do here. There is no desktop to dress up with family photos or high-resolution screenshots from "Avatar." There is no Microsoft Quick Launch toolbar or Apple Dock for accessing frequently used programs. It was just me and Chrome, and this quieter approach wasn't half bad.
There is a media player for playing content such as music and video stored on an SD card, which can be inserted into a slot at the front left corner of the Chromebook. The software didn't have many features, such as equalizer settings, but it worked fine when I wanted a little background music.
Not all is rosy with the Chromebook. At one point I lost the wireless connection at my office, and the online magazine I was reading suddenly rendered a lot of broken links. I edit a lot of photos and video, and those tasks just aren't possible with the degree of control I'm used to without some standalone applications.
When you're offline with the Chromebook, you are truly going off the grid and you're not likely to accomplish much of anything. It's a brick without a connection to the cloud.
The Chromebook isn't the best choice as your only computer, but it's a fine second computer for the type of casual use that is becoming the primary activity for many people busy living in their social graph.
Saturday, July 16, 2011
In our Intel roadmap article published in May, we briefly previewed Intel's upcoming 700 Series SSDs. Back then there wasn't much to talk about as very few specs were known. Today we have some additional details to share, thanks to German site ComputerBase.de.
Intel SSD 700 Series
Series | 710 | 720
Codename | Lyndonville | Ramsdale
Capacities (GB) | 100/200/300 | 200/400
NAND type | 25nm MLC-HET | 34nm SLC
Cache (DRAM) | 64MB | 512MB
Interface | SATA 3Gb/s | PCIe 2.0
Read speed | 270MB/s | 2200MB/s
Write speed | 210MB/s | 1800MB/s
4KB random read | 35,000 IOPS | 180,000 IOPS
4KB random write | 3,300 IOPS | 56,000 IOPS
Power (active/standby) | 4W/0.095W | 25W/8W
Security | AES-128 encryption | AES-256 encryption
Data path protection | LBA tag checking | End-to-end data protection
I want to start off by saying that these SSDs are aimed at enterprise use. If you want an SSD for your gaming rig, look at our mid-range SSD roundup instead.
The Intel 700 Series is meant to replace the X25-E lineup, Intel's enterprise series, which hasn't been updated since late 2008, so a refresh is long overdue. However, neither of these is an exact successor. The 710 Series is closer with its 2.5" form factor and SATA 3Gb/s interface. In terms of specs, the 710 Series is actually pretty close to the 320 Series: sustained write speed is slightly higher but random performance is a bit lower. The biggest difference between the 320 and 710 Series is the NAND type. The 320 Series uses the regular MLC found in most mainstream SSDs; the 710 Series is Intel's first enterprise-level SSD to use MLC NAND, but not just any kind of MLC: it will use MLC-HET NAND. MLC-HET offers more write cycles per cell, so longevity is increased, which is crucial for enterprises. The downside is data retention: after all write cycles have been used, MLC-HET will hold data for only 3 months, whereas normal MLC will hold it for 12 months. However, this shouldn't be an issue given the increased number of write cycles. For the record, MLC-HET with 20% over-provisioning (OP) appears to offer roughly 65 times greater endurance than normal MLC.
The 720 Series will be Intel's first PCIe SSD. To take full advantage of it, you will need at least a PCIe 2.0 x8 slot, since an x4 slot provides only up to 2GB/s while the 720 Series promises read speeds of up to 2.2GB/s. It will use 34nm SLC NAND, which is pretty common for high-end enterprise SSDs due to SLC's much better endurance. The 720 Series promises up to 36PB (yes, as in 36,000TB) of 8KB writes for the 400GB SSD. That is nearly 1,000 times more durable than 25nm MLC and over 10 times more durable than 25nm MLC-HET.
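To put those numbers in perspective, here's a quick back-of-the-envelope sketch (assuming the commonly cited ~500MB/s of usable bandwidth per PCIe 2.0 lane after 8b/10b encoding overhead):

```python
# Back-of-the-envelope check on the bandwidth and endurance figures quoted
# above. PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, which
# leaves roughly 500 MB/s of usable bandwidth per lane in each direction.

PCIE2_LANE_MBPS = 500  # usable MB/s per PCIe 2.0 lane (assumed rule of thumb)

def slot_bandwidth_mbps(lanes: int) -> int:
    """Usable one-way bandwidth of a PCIe 2.0 slot with the given lane count."""
    return lanes * PCIE2_LANE_MBPS

x4 = slot_bandwidth_mbps(4)  # 2000 MB/s: just under the 720's 2200 MB/s reads
x8 = slot_bandwidth_mbps(8)  # 4000 MB/s: plenty of headroom
print(f"x4 slot: {x4} MB/s, x8 slot: {x8} MB/s")

# Endurance: 36 PB of writes on a 400 GB drive means the full drive can be
# overwritten 90,000 times before the rated write cycles run out.
endurance_pb = 36
capacity_gb = 400
full_drive_writes = endurance_pb * 1_000_000 // capacity_gb
print(f"Full-drive overwrites: {full_drive_writes:,}")
```

This also makes it obvious why an x4 slot won't cut it: 2,000MB/s of slot bandwidth falls short of the drive's 2,200MB/s rated reads.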
 | 320 Series | 710 Series | 720 Series
Capacity (GB) | 80 / 160 / 300 | 100 / 200 | 200 / 400
Endurance (TB) | ? / ? / ? | 500/900 (20% OP) / 1000/1300 (20% OP) | 18,000 / 36,000
Reliability (MTBF) | 1.2 million hours | 2.0 million hours | N/A
One of the biggest and most needed upgrades from the X25-E is the much better encryption support. The X25-E offered only ATA password protection, which is far too vulnerable by today's standards, especially considering that even the mainstream 320 Series supports 128-bit AES. The 720 Series takes that one step further by supporting 256-bit AES encryption. This is very important for enterprises handling confidential data: you don't want your data getting into the wrong hands, and you're ready to pay a premium for the best protection.
This update is essential for Intel to stay competitive in the enterprise SSD market. It has already been 2.5 years since the last update, and considering the progress SSDs have made during that time, it's surprising that the update hasn't taken place sooner. There isn't much news on the release schedule, so the best we've got is what we have already posted: Q2'11 for the 710 Series and Q4'11 for the 720 Series. The 710 Series seems to be the low-end offering and is basically the same as the 320 Series with improved endurance. The 720 Series, on the other hand, is an SSD for heavy enterprise use, with features to match. OCZ has pretty much dominated the PCIe SSD market so far, but Intel's 720 Series could offer some serious competition at the high end.
The regular refreshes that come from notebook vendors aren't often the stuff of exciting news. It's generally a processor update, maybe a slight change in shell design.
That's partly true of Toshiba's 2011 refresh as well, but this year they've timed their update to coincide with the launch of AMD's Llano APU and NVIDIA's launch of the GeForce GTX 560M. Their Fusion finish is also getting a much-needed update, along with the top-of-the-line Qosmio. Bottom line: there's a lot going on at Toshiba.
Toshiba's Satellite is broken down into three different lines: the budget-minded C-series, the mainstream L-series, and the performance-oriented P-series. Starting from the bottom, we have the C-series, which launched earlier this year and currently offers AMD's Brazos APUs, from the C-50 up to the E-350. The only major update here is that Toshiba will now be shipping a 17.3" model, putting a large desktop replacement notebook within the reach of more budget-oriented consumers. These start at $379.99, and the Toshiba rep noted that the 17.3" C-series model would be around $499. Yes, Brazos in a 17.3" notebook.
When you bump up to the L series, you get access to Sandy Bridge, but now there's also Llano. Sandy Bridge-based Intel Core i3 and i5 processors will be available, with AMD-based models now sporting A4 and A6 dual- and quad-core processors. Notebooks will range from 13.3" up to 17.3".
The updated Fusion finish remains one of glossy plastic's last strongholds in retail, but the textured appearance makes it far less liable to pick up fingerprints and all the usual mess that comes with gloss; unfortunately Toshiba is still sticking with the glossy keyboard. Finally, the line will come with USB charging, a wide variety of colors (including a very attractive brushed aluminum blue as an alternative to the gloss), and in some configurations a Blu-ray drive. The 14" L745 series starts at $449.99, the 15.6" L755 series starts at $483.99, and the 17.3" L775 series starts at $579.99.
UPDATE: Toshiba let us know the 13" L735 will only be available with Intel processors. It's a shame; that form factor seems like a great place for Llano.
The P700 line may seem the most compelling, though. While these still sport the Fusion X2 finish, it's been toned down and the keyboard has been replaced by the slightly glossy island-style found on the Portege and the new Tecra lines. That keyboard is still a little bit problematic, but it's a major improvement on the older glossy flat keys. In addition, Toshiba implements Waves Audio and USB 3.0 along with USB charging across the entire line, and these notebooks will feature Sandy Bridge processors all the way up to i7 along with AMD's A6-3400M. Some configurations will also include WiMax, Blu-ray, and even NVIDIA GeForce GT 540M graphics with Optimus. The 14" P745 starts at $699.99, the 15.6" P755 starts at $629.99, the 17.3" P775 starts at $629.99, and if you want to make the jump to 3D there'll be a 17.3" P775 3D model at $1,199.99 that includes a 120Hz screen and active shutter glasses.
Last but not least is the new Qosmio X770. If you're like me you probably thought last generation's Qosmio was ostentatious at best, bulky and gaudy at worst. The X770 has had a major facelift and it's a real improvement. Red remains the signature color for the sleeker, slimmer new Qosmio but honestly, the red backlighting behind the keyboard looks downright evil, which may or may not be your cup of tea (it's mine). The 17.3" X770 comes equipped with a Core i5 or i7, a GeForce GTX 560M standard, and up to a 1080p screen. It starts at $1,199.99, but peaks with the X775 3D which comes equipped with a 120Hz screen and active shutter glasses at $1,899.99.
All of these notebooks are expected to become available by the end of the month, and we're planning on getting one of the new Qosmios in hand as soon as possible.
Remarks made by Microsoft's Andy Lees at the company's Worldwide Partner Conference are igniting rumors that the next-generation Xbox could run Windows as its operating system. The expected time frame for this integration is about four years, timing that would coincide with the expected refresh of the Xbox and with Windows 8.
Analysts are chewing over the implications of Microsoft's announced plans to create a unified ecosystem for phones, computers and other devices. One thing that could mean is that the next-generation Xbox could run on a version of Windows.
Andy Lees, who heads up Microsoft's Windows Phone division, said at the company's Worldwide Partner Conference in Los Angeles that the time is approaching when there won't be separate ecosystems for phones and tablets. "They'll all come together," Lees said, hinting at "a single ecosystem and not ecosystems themselves."
The expected time frame for this integration could be about four years, according to sources who spoke to The Escapist blog, timing that would coincide with the expected refresh of the Xbox and with Windows 8. Thisismynext's Nilay Patel also noted that Lees said the Redmond, Wash.-based software giant wants to provide "coherence and consistency" across devices, "particularly with Xbox." Sources told Thisismynext that Microsoft may even give up the name Windows, the company's best-established trademark, for something entirely new in that 2015-2016 time frame.
Michael Inouye, digital home analyst for ABI Research, said reports of the Windows Xbox are "certainly credible," but he added, "the implementation of a cross-platform/device OS might not happen as some are envisioning. The branding might span devices as may some of the critical components, but I would expect the next Xbox to remain a 'separate' platform."
It is unlikely, he said, that the next Xbox will be able to install non-gaming or non-browsing software such as the Office productivity suite.
"The Xbox will remain a game/media machine first and foremost, which is why the Xbox and Xbox 360 operating systems were designed separately from Windows," said Inouye. "This isn't to say there aren't any shared components, but they are separate operating systems. I would expect some aspects like a Microsoft application storefront/service to be cross platform/device as well as other features/services like Internet Explorer, social networking, video marketplaces, etc. We are already seeing some of this with Xbox Live."
Controlling Your Experience
Another concern, he said, was that Microsoft must limit what users can install and do on its gaming platform to ensure the Xbox is optimized for media and games.
"In addition there are security issues that also favor a closed/protected system as well -- e.g., content protection and protecting users from malware."
But Microsoft and other major players in the gaming industry may have to do something to ramp up interest in platform-based games: A new survey by NPD found that sales of games suffered a major drop in June, bringing in revenue of $995 million, compared to $1.1 billion during the same month in 2010, a 10 percent drop.
Software sales fell 12 percent, and game accessories fell 11 percent, a likely result of greater use of free or cheap downloadable game apps for tablets and smartphones.
Microsoft can worry less: The Xbox 360 dominated the market in May with 34 percent of all game revenues, the Wall Street Journal reported.
Friday, July 15, 2011
Back at CES 2011, Samsung showed us something that may have seemed a bit futuristic back then but which is now reality. They showed us a monitor that connects to your laptop wirelessly and on top of that, the monitor acts as a USB hub and the USB devices connect wirelessly too. Samsung calls this technology Central Station. You simply connect a small USB dongle to your laptop, take the laptop within the monitor’s range and your laptop automatically connects to the monitor and peripherals attached to it, wirelessly. You walk away and the monitor goes black. Pretty simple, right?
During the last few years, laptops have become powerful enough for people to use as their main computer. Sales figures support this too. If we look at Apple's sales figures, for example, twice as many desktops were sold as laptops in 2002. In recent years, however, laptops have stolen a huge portion of desktop sales, reversing the situation of 2002: laptops now outsell desktops by a two-to-one ratio.
While laptops can now offer performance sufficient for some real work, there is one thing that they cannot offer: screen real estate. Most laptops have around a 15" screen, but resolutions vary a lot depending on the model and price range (1366x768 at the low end, up to 1920x1200 in higher-end models). 22-24" monitors with 1920x1080 resolutions have more or less become the standard for desktops, and such monitors can be had for ~$150 nowadays.
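To quantify that screen real estate gap, a quick pixel-count comparison of the resolutions mentioned above:

```python
# Comparing usable pixels across the panel resolutions discussed above.
resolutions = {
    "laptop low-end (1366x768)": 1366 * 768,
    "laptop high-end (1920x1200)": 1920 * 1200,
    "desktop 1080p (1920x1080)": 1920 * 1080,
}

base = resolutions["laptop low-end (1366x768)"]
for name, pixels in resolutions.items():
    # Ratio relative to the common 1366x768 laptop panel
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x)")
```

A $150 1080p monitor gives you nearly double the pixels of the typical low-end laptop panel, which is exactly the gap the external-monitor setup is meant to close.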
Because laptops lack screen real estate, a laptop with a separate monitor at home has become a very common setup. This gives you portability on the go and a desktop-like setup at home, with extra screen size and resolution. However, this kind of setup has one big drawback: you need to plug in a bunch of cables when you get home to use your external monitor and other peripherals. With a desktop, you just turn the computer on or wake it from sleep; there's no hassling with cables. You could call it lazy if plugging in a few cables is too much, but in practice you often end up either using the laptop on the desk with the monitor all the time, or never plugging in the monitor at all while it sits on the desk unused.
Laptop docking stations have offered a solution, but there are several shortcomings. First off, not all laptops have a design suitable for docking (Apple's laptops come to mind). Second, each dock fits only certain laptop models, so if you buy a new laptop you most likely have to buy a new dock as well. A new dock will easily set you back at least $100, which is quite a lot for an ugly piece of plastic. Finally, many consumer laptops don't have suitable docks at all, which rules out a docking station completely.
This is where Central Station becomes interesting. You don't need to connect any cables and you don't need an ugly dock. Everything is wireless.
AMD announced the acquisition of ATI in 2006. By 2007 AMD had a plan for CPU/GPU integration, and it looked like this: the red blocks in the diagram below were GPUs, the green blocks were CPUs. Stage 1 was supposed to be dumb integration of the two (putting a CPU and GPU on the same die). The original plan called for the first Fusion APU to arrive sometime in 2008-2009. Of course that didn't happen.
Brazos, AMD's very first Fusion platform, came out in Q4 of last year. At best AMD was two years behind schedule, at worst three. So what happened?
AMD and ATI both knew that CPU design and GPU design were incredibly different. CPUs, at least for AMD back then, were built on a five-year architecture cadence. Designers used tons of custom logic and hand layout in order to optimize for clock speed. In a general-purpose microprocessor instruction latency is everything, so optimizing to lower latency wherever possible was the top priority.
GPUs, on the other hand, come from a very different world. Drastically new architectures ship every two years, with major introductions made yearly. Very little custom logic is employed in GPU design by comparison; the architectures are highly synthesizable. Clock speed is important but it's not the be-all and end-all. GPUs get their performance from being massively parallel, and you can always hide latency with a wide enough machine (and a parallel workload to take advantage of it).
The manufacturing strategy is also very different. Remember that at the time of the ATI acquisition, only ATI was a fabless semiconductor company; AMD still owned its own fabs. ATI was used to building chips at TSMC, while AMD was fabbing everything in Dresden at what would eventually become GlobalFoundries. While the folks at GlobalFoundries have done their best to make their libraries portable for existing TSMC customers, it's not as simple as showing up with a chip design and having it work on the first go.
As much sense as AMD made when it talked about the acquisition, the two companies that came together in 2006 couldn't have been more different. The past five years have really been spent trying to make the two work together both as organizations as well as architectures.
The result really holds a lot of potential and hope for the new, unified AMD. The CPU folks learn from the GPU folks and vice versa. Let's start with APU refresh cycles. AMD CPU architectures were updated once every four or five years (K7 1999, K8 2003, K10 2007) while ATI GPUs received substantial updates yearly. The GPU folks won this battle as all AMD APUs are now built on a yearly cadence.
Chip design is also now more GPU inspired. With a yearly design cadence there's a greater focus on building easily synthesizable chips. Time to design and manufacture goes down, but so do maximum clock speeds. Given how important clock speed can be to the x86 side of the business, AMD is going to be taking more of a hybrid approach where some elements of APU designs are built the old GPU way while others use custom logic and more CPU-like layout flows.
The past few years have been very difficult for AMD but we're at the beginning of what may be a brand new company. Without the burden of expensive fabs and with the combined knowledge of two great chip companies, the new AMD has a chance but it also has a very long road ahead. Brazos was the first hint of success along that road and today we have the second. Her name is Llano.
Despite a rocky economic landscape in numerous global markets, Google's earnings prove that the company's bets on its traditional advertising strategy and its new mobile advertising strategy are paying off handsomely. Google says its second-quarter revenues exceeded $9 billion, and that the company will continue making investments.
In good news for the tech sector, Google on Thursday announced earnings for the quarter ended June 30, 2011. Google reported revenues of $9.03 billion.
"We had a great quarter, with revenue up 32 percent year on year for a record-breaking over $9 billion of revenue," said Larry Page, CEO of Google. "I'm super excited about the amazing response to Google+ which lets you share just like in real life."
At a company press conference, Page said the company would continue making significant investments while keeping financial management tight.
"Of course, I understand the need to balance the short term with the longer-term needs because our revenues and growth serve as the engine that funds our innovation," Page said. "But our emerging high-usage products can generate huge new businesses for Google in the long run, just like search, and we have tons of experience monetizing successful products over time."
By the Numbers
GAAP operating income in the second quarter totaled $2.88 billion, or 32 percent of revenues. That compares to GAAP operating income of $2.37 billion, or 35 percent of revenues, in the year-ago period. GAAP net income in the second quarter of 2011 was $2.51 billion, compared to $1.84 billion in the year-ago period.
Google-owned sites generated revenues of $6.23 billion, or 69 percent of total revenues. That's a 39 percent increase over second quarter 2010 revenues of $4.5 billion. And Google's partner sites generated revenues, through AdSense programs, of $2.48 billion, or 28 percent of total revenues, a 20 percent increase. Revenues from outside of the United States totaled $4.87 billion, 54 percent of total revenues. That's fairly consistent with the year-ago period at 53 percent.
Operating expenses, other than cost of revenues, were $2.97 billion in the second quarter of 2011, or 33 percent of revenues, compared to $1.99 billion in the second quarter of 2010, or 29 percent of revenues. In the second quarter, free cash flow was $2.60 billion.
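The percentages above follow directly from the reported figures; here's a quick sketch using the rounded numbers quoted in this article (note the partner-sites share comes out at 27 rather than the reported 28 percent, since Google rounds from unrounded revenue figures):

```python
# Recomputing the reported margin percentages from the raw figures quoted
# above (all amounts in billions of dollars, as rounded in the article).
revenue_q2_2011 = 9.03
gaap_op_income = 2.88
google_sites = 6.23
partner_sites = 2.48
intl_revenue = 4.87

def pct(part: float, whole: float) -> int:
    """Share of `whole` represented by `part`, rounded to a whole percent."""
    return round(part / whole * 100)

print(pct(gaap_op_income, revenue_q2_2011))  # 32 (% of revenues)
print(pct(google_sites, revenue_q2_2011))    # 69 (% from Google-owned sites)
print(pct(partner_sites, revenue_q2_2011))   # 27 (article reports 28)
print(pct(intl_revenue, revenue_q2_2011))    # 54 (% from outside the US)
```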
Google vs. Apple
"Google's mobile advertising strategy is paying off royally," said Charles King, principal analyst at Pund-IT. "Google reminds me of Microsoft in many ways. They have so many irons in so many fires they are bound to fail more often than they succeed. And the stories of those failures tend to dominate headlines because it's juicier news."
High-profile publicized failures aside, King said, Google's core business is healthy. And despite a rocky economic landscape in numerous global markets, King said, Google's earnings prove that the company's bets on its traditional advertising strategy and its new mobile advertising strategy are paying off handsomely.
"There were some stories published last week on rumors that Apple has drastically discounted the advertising rates for its own mobile advertising strategy. Businesses don't make radical price cuts in their service offering when they are succeeding," King said. "They usually cut costs when those services are having a problem. It's kind of interesting to see Apple slicing rates a week or so before Google announces a fairly majestic bump in earnings driven at least in part by its own mobile strategy."
Thursday, July 14, 2011
New entrants to the keyboard app market include the SwiftKey X for smartphones and SwiftKey Tablet X. TouchType's latest apps offer a Fluency 2.0 language inference engine, which marks the latest evolution in artificial intelligence-driven text entry. Fluency 2.0 uses machine learning to predict and correct what the user will type.
Not satisfied with the keyboard that comes with your Android device? Now you have yet another option in what is becoming a crowded market: the SwiftKey X.
TouchType on Thursday rolled out two new Android keyboard apps: SwiftKey X for smartphones and SwiftKey Tablet X. The two virtual keyboards are TouchType's second market offering. The company launched an Android keyboard app last year that has been downloaded more than 1.5 million times.
"The keyboard app market is definitely a growing space. I don't know how much room there is for everyone, and overall it remains to be seen whether it's better to allow innovation around the keyboard or just provide a really good one to start with," said Avi Greengart, an analyst at Current Analysis. "Even Apple is providing a split keyboard option for the iPad with the next iOS."
SwiftKey's Latest Bells
TouchType's latest apps offer an upgrade over the original, including what the firm calls a Fluency 2.0 language inference engine, which marks the latest evolution in artificial intelligence-driven text entry. Fluency 2.0 uses machine learning to observe how a user composes text and then predict what they will likely type next, correcting errors along the way.
TouchType also taps a cloud-based personalization service that learns from a user's message history to predict and correct words. Then there's Touch Interaction Modeling, a technology that conducts real-time analysis of the user's touchscreen typing precision to improve word accuracy and predictions.
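TouchType hasn't detailed how Fluency 2.0 works internally, but the core idea of learning next-word predictions from a user's message history can be illustrated with a toy bigram model (this is purely a sketch, not TouchType's engine; the sample messages are made up):

```python
# Toy illustration of next-word prediction from a user's message history.
# A bigram model counts which word tends to follow which, then suggests
# the most frequent successors; real engines are far more sophisticated.
from collections import Counter, defaultdict

def train(history: list[str]) -> dict:
    """Count word-to-word transitions across the user's past messages."""
    model = defaultdict(Counter)
    for message in history:
        words = message.lower().split()
        for prev, nxt in zip(words, words[1:]):
            model[prev][nxt] += 1
    return model

def predict(model: dict, prev_word: str, n: int = 3) -> list[str]:
    """Suggest the n most likely next words after `prev_word`."""
    return [word for word, _ in model[prev_word.lower()].most_common(n)]

history = [
    "see you at the office",
    "running late, see you soon",
    "see you tomorrow at the office",
]
model = train(history)
print(predict(model, "see"))  # ['you']
print(predict(model, "the"))  # ['office']
```

Personalization in this framing is simply retraining on the individual's own messages, which is why a cloud-side copy of message history is useful to a keyboard vendor.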
Like the upcoming iOS, SwiftKey Tablet X offers a split-key layout for thumb typing on the larger touchscreen tablet form factor. Other new features include keyboard themes and customizations, support for more than 20 languages, and the ability to type in up to three languages at once with language-aware auto-correction.
There are other options, or, as Greengart said, consumers can depend on hardware makers to innovate. For example, Greengart said virtual keyboard app competitor Swype has the widest distribution. Swype allows users to slide a finger from letter to letter to form words rather than pressing each virtual key.
"Many vendors have included Swype pre-loaded on some of their phones. With tablets there is even more real estate there to differentiate your keyboard," Greengart said. "Google encourages developers to plug in different keyboards and other elements into the Android OS."
Beyond Apple's upcoming innovations, Greengart also pointed to HP's work on the TouchPad keyboard, which has a numbers row.
"It's a trade-off as to how much of the rest of the screen you want to give up to have larger keys. Everyone's preferences may be slightly different," Greengart said. "HP gives you the option to change that around. Certainly the ability to have number keys available without having to go to a separate layout is convenient."
SwiftKey X supports all Android 2.X smartphones and retails at $3.99. SwiftKey Tablet X supports all Android 2.X and 3.X tablets and retails at $4.99.
Wednesday, July 13, 2011
If you haven't gotten the hint, today is all about Llano. The big story is of course Llano's notebook appearance; however, in the coming weeks you'll be hearing a lot more about Llano on the desktop as well. This is AMD's Socket-FM1, the brand new socket that'll be used for desktop Llano parts:
If you read our Computex coverage, the socket should look pretty familiar. Motherboard manufacturers all over Taiwan are busy readying their Socket-FM1 boards for retail release. In fact, there was so much interest in desktop Llano among motherboard manufacturers that a number of Socket-FM1 boards and CPUs made their way off the island as Computex ended.
Existing Socket-AM3 coolers will work on FM1 motherboards
By now you may have already seen a lot of information leaked from AMD's Llano presentations, as well as its desktop strategy. In the past few days performance numbers have been revealed as well. While we're hard at work on our full review of AMD's desktop Llano APU, we wanted to chime in with some thoughts on Llano's desktop performance.
AMD isn't ready to disclose pricing or the entire product matrix for Llano on the desktop, but what we do have is the high-end desktop Llano SKU: AMD's A8-3850.
The 3850 has four cores running at 2.9GHz and doesn't support Turbo Core. On the GPU side it has the full Radeon HD 6550D configuration with 400 shader processors running at 600MHz.
Sandy Bridge's GPU performance is the target, but how much better will AMD do on the desktop? Let's find out.
It's been a while since we've discussed AMD motherboards at AnandTech—over the next few months, I am hoping to bring them back. To start, we have our first Desktop Llano product on the test bed—the ASRock A75 Extreme6.
Unfortunately, what I've been testing is still 'a work in progress', so there are BIOS and design issues still to be decided. For now I'll let you know what I've found in terms of performance and design, but when the final board comes my way with release information, I'll post a full review.
The desktop Llano series is the mainstream jewel in the AMD calendar. As Anand has discussed, AMD's Fusion APU plan is split between the Brazos platform (with Ontario and Zacate) of sub-18W processors with Bobcat cores, and the Lynx platform (with Llano) for 25-100W processors. The former has 1-2 Bobcat cores, whereas with Llano we're dealing with 2-4 K10 cores.
In terms of motherboard design, the Llano processor absorbs the traditional northbridge, and the motherboard will use a series of 'Fusion Controller Hubs', codenamed Hudson. The desktop version will use the Hudson-D series Fusion Controller Hubs, with the A75 desktop 'Lynx' models under the Hudson-D3 header. The main selling points will be the six native SATA 6 Gbps ports and the four native USB 3.0 ports.
The Lynx platform comes with some interesting points: hybrid CrossFireX with any 6-series GPU and the APU, native USB 3.0 and SATA 6 Gbps, and dual-channel DDR3-1866 native support. Here are some comparisons with P67:
Codename                 Lynx (Desktop)             Sandy Bridge
SATA 6 Gbps + 3 Gbps     6 + 0                      2 + 4
Memory Support           DDR3-1866                  DDR3-1333 / 2133 OC
PCIe                     16x or 8x/8x               16x or 8x/8x
RAID                     0, 1, 10                   0, 1, 5, 10
USB 3.0 + 2.0 + 1.1      4 + 10 + 2                 0 + 14 + 0
Display Output           VGA + 1 dedicated /        VGA + 3 HDMI/DVI/DP
                         4 shared (HDMI/DVI/DP)
                         from APU
FIS-Based Switching      No                         Yes
Overclocking             Clock                      Multiplier
For displays, two four-lane interfaces are dedicated to DisplayPort 1.1, DVI, and HDMI, though not all output combinations are possible.
AMD are keen to point out the power consumption curves generated by the gating of the processor and system, depending on various sleep states—citing a one second recovery from S3.
But alas, most of the hype regarding Fusion and Llano is CPU-based. In terms of the motherboard, it's up to the designers to get creative, so let's take a look at the ASRock A75 Extreme6.
An Israeli app is bringing together rivals Facebook and Google+. Google+Facebook creates a view of your newsfeed from Facebook within Google+, but some are warning about security flaws in the app's code that may leave your system vulnerable. Experts say the concept of Google+Facebook shows social media users want the best of both worlds.
Want to try out Google+, the newest social networking site, without giving up your Facebook account? You guessed it. There's an app for that.
"Google+Facebook" was created by Crossrider, an Israel-based startup that boasts that it took about a day to create the app, downloadable from its web site for users of Google's Chrome and Mozilla's Firefox browsers. It creates a view of your newsfeed from Facebook within Google+. About 100,000 people have already checked out the app so far. But a founder of the company, Koby Menachemi, said the product is not perfect, and some observers agree, even saying it may harm your computer.
A commenter on the social news site Reddit, RogueDarkJedi, caused a stir with a post detailing what he said were security flaws in the app's code and warning that downloading the program may be essentially adding malware.
Among other concerns, RogueDarkJedi noted that, "The API makes multiple references to a premium service. What this means is that if the author of the plugin fails to pay the service money, Crossrider can force all users of the plugin to install additional crap. This is a forced change that you cannot opt-out of."
Menachemi responded with a point-by-point rebuttal, noting that "Crossrider DOES NOT install any extensions other than the specific extension the user has downloaded and confirmed to install!"
Google+ already has an estimated 10 million users, gaining a better public reception than Google Buzz, which was launched in February, 2010, and widely panned because it automatically added and linked Gmail users, raising privacy concerns.
In an interview with Reuters, Menachemi described his app as "a site within a site ... If users want a feature to post updates on both networks, we will. If they want to comment on their Facebook screen, we will do it."
Charles King, principal analyst at Pund-IT, said Google+Facebook shows that social media users want the best of both worlds.
"I think its popularity is interesting and reinforces the differences between Facebook & Google+," said King. "If they were essentially equivalent, I doubt Google+ would be enjoying its flush of initial interest. Clearly a significant number of users want to easily move between the two sites."
Never Say Never
Does Google+ stand a chance against Facebook?
"In IT, I never say never," said King. "Plenty of high flyers have blown it by losing sight of what they do best and their customers want most. Plenty of non-entities went from 0 to 100 mph faster than anyone expected. Google's blown it numerous times, but the company still dominates its core markets. I'm not ready to count them out."
The app also shows the recent impact of Israeli tech startups. Another firm from the Jewish state that caused a recent stir is Tawkon, which makes an app that estimates radiation from cell phones.
"Many major vendors have research facilities in Israel, and numerous Israeli start-ups have succeeded," said King.
Tuesday, July 12, 2011
I remember standing in the audience of Samsung's CTIA press conference as it announced, for the first time ever, pricing and availability of its unreleased Galaxy Tab 10.1 and 8.9 before shipping.
The smartphone (and early tablet) industries have gone this long without having to really compete based on price, mostly because in North America the carriers subsidize much of the cost. If every device costs $199 under contract, why get carried away with details like how much it actually costs?
The Galaxy Tab however was playing in a different space. While Apple ultimately caved to the pressures of carrier subsidies with the iPhone, the iPad remains completely unsubsidized and its followers buy it by the millions. The magical price point is $499 and it was at Samsung's CTIA press conference that it announced it would be matching Apple's $499 price point, and even dropping slightly below it for the 8.9-inch version.
At the time it seemed like a bold move, enough to give Honeycomb the fighting chance it needed. The Galaxy Tab would be thinner and lighter than the iPad 2 but competitively priced as well. This wouldn't be another Xoom.
Samsung Galaxy Tab 10.1 (top) vs. ASUS Eee Pad Transformer (bottom)
Then ASUS showed up. At $399, the Eee Pad Transformer not only offered a different usage model from the iPad and Galaxy Tab, it brought a lower price tag as well. Availability has been slim thanks to component shortages, but with the Eee Pad selling for $399, the Galaxy Tab at $499 all of a sudden seems overpriced.
2011 Tablet Comparison

Apple iPad 2: Apple A5 (dual ARM Cortex A9 @ 1GHz), PowerVR SGX 543MP2 GPU, 512MB RAM, 1024 x 768 IPS display, 16GB NAND, 241.2 x 185.7 x 8.8 mm, 601g, $499
ASUS Eee Pad Transformer: NVIDIA Tegra 2 (dual ARM Cortex A9 @ 1GHz), NVIDIA GeForce GPU, 1GB RAM, 1280 x 800 IPS display, 16GB NAND, 271 x 175 x 12.95 mm, 695g, $399
Motorola Xoom WiFi: NVIDIA Tegra 2 (dual ARM Cortex A9 @ 1GHz), NVIDIA GeForce GPU, 1GB RAM, 1280 x 800 display, 32GB NAND, 249.1 x 167.8 x 12.9 mm, 730g, $599
Samsung Galaxy Tab 10.1: NVIDIA Tegra 2 (dual ARM Cortex A9 @ 1GHz), NVIDIA GeForce GPU, 1GB RAM, 1280 x 800 PLS display, 16GB NAND, 256.6 x 172.9 x 8.6 mm, 565g, $499
Based on specs alone you'd be right. Samsung's Galaxy Tab 10.1 has the same NVIDIA Tegra 2 tablet SoC inside, 1GB of LPDDR2, and 16GB of NAND on board. You get a 10.1-inch 1280 x 800 PLS display and 802.11n WiFi support. It's worth pointing out that we're now well into the month of June and NVIDIA continues to be the only SoC vendor shipping in Honeycomb tablets. Samsung originally had plans to ship its own Exynos SoC in the Galaxy Tab, but Tegra 2 remains the SoC of choice for all Honeycomb vendors at this point. Whether or not NVIDIA can win twice in a row with Ice Cream Sandwich later this year remains to be seen.
Samsung Galaxy Tab 10.1 (left) vs. ASUS Eee Pad Transformer (Right)
Where Samsung gives you something more for your money is in build quality and form factor. While the Eee Pad Transformer feels surprisingly good for a cost-reduced tablet, it doesn't feel nearly as slim or portable as the Galaxy Tab 10.1. It's no wonder Samsung went back to the drawing board on this one; the result is something that in many ways feels better than the iPad 2.
Apple still gets the nod in terms of quality of materials. The aluminum back of the iPad 2 is unbeatable. The Galaxy Tab 10.1 however feels lighter, a bit more rugged (I'm less concerned about scratching plastic than I am about marring aluminum) and a little more comfortable to hold as a result. Against the Eee Pad there's no competition. I can live with the Eee Pad, but I much prefer the feel of the Galaxy Tab 10.1. The new Tab just feels like a device from this year - a compliment that, until now, I had only given to Apple.
The Galaxy Tab 10.1 measures just 8.6mm thick, 0.2mm thinner than the iPad 2. To be honest you really can't tell the difference; both devices feel thin. Even after holding them back to back it's near impossible to tell that Samsung has built a thinner device. The most tangible difference in feel is the weight, not just in overall mass but in terms of weight distribution. The Galaxy Tab seems to carry the weight a bit better than the iPad, a bit more evenly.
Samsung Galaxy Tab 10.1 (left) vs. Apple iPad 2 (right)
Now is as good a time as any to point out that although Samsung calls this the Galaxy Tab 10.1, it's really the new Galaxy Tab 10.1. Originally Samsung announced a much thicker version at Mobile World Congress, a month prior to the CTIA announcement. With the much thinner iPad 2 hitting the market after the original 10.1 announcement, Samsung scrapped plans for the original and unveiled the thinner 8.6mm version as the new Galaxy Tab. The original Galaxy Tab 10.1 is now known as the Galaxy Tab 10.1v.
The popularity of Intel's HD Graphics amongst HTPC enthusiasts and the success of AMD's APUs seem to indicate that the days of the discrete HTPC GPU are numbered. However, for those with legacy systems, a discrete HTPC GPU will probably be the only way to enable hardware accelerated HD playback. In the meantime, discrete HTPC GPUs also aim to offer more video post processing capabilities.
In this context, both AMD and NVIDIA have been serving the market with their low end GPUs. These GPUs are preferable for HTPC scenarios due to their low power consumption and ability to be passively cooled. Today, we will be taking a look at four GPUs for which passively cooled solutions exist in the market. From AMD's side, we have the 6450 and 6570, while the GT 430 and GT 520 make up the numbers from the NVIDIA side.
Gaming benchmarks are not of much interest to the HTPC user interested in a passively cooled solution. Instead of focusing on that aspect, we will evaluate factors relevant to the AV experience. After taking a look at the paper specifications of the candidates, we will describe our evaluation testbed.
We will start off the hands-on evaluation with a presentation of the HQV benchmarks. This provides the first differentiating factor.
While almost all cards (including the integrated graphics on CPUs) are able to play back HD videos with some sort of acceleration, videophiles are more demanding. They want to customize the display refresh rate to match the source frame rate of the video being played. Casual HTPC users may not recognize the subtle issues created by mismatched refresh rates. However, improper deinterlacing may lead to highly noticeable issues. We will devote a couple of sections to seeing how the cards handle custom refresh rates and fare at deinterlacing.
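To make the refresh rate issue concrete, here is a toy calculation (our illustration, not anything from the review's test suite) showing how many display refreshes each source frame occupies. When the refresh rate is not an integer multiple of the frame rate, the pattern is uneven, and that unevenness is the judder videophiles complain about:

```python
from fractions import Fraction

def cadence(refresh_hz, fps, frames):
    """Return how many display refreshes each of the first `frames`
    source frames occupies, for a given refresh rate and frame rate."""
    r, f = Fraction(refresh_hz), Fraction(fps)
    # Vsync index at which each frame boundary falls
    boundaries = [int(i * r / f) for i in range(frames + 1)]
    return [boundaries[i + 1] - boundaries[i] for i in range(frames)]

# 24 fps film on a stock 60 Hz desktop: the classic uneven 3:2 cadence
print(cadence(60, 24, 6))   # [2, 3, 2, 3, 2, 3]

# The same film with the display refresh matched to the source: perfectly even
print(cadence(24, 24, 6))   # [1, 1, 1, 1, 1, 1]
```

The same arithmetic explains why 23.976 fps content on a display that can only manage exactly 60.000 Hz still exhibits a dropped or repeated frame every few seconds, even with 3:2 pulldown applied.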
After this, we will proceed to identify a benchmark for evaluating HTPC GPUs. This benchmark gives us an idea of how fast the GPUs can decode the supported codecs, and whether faster decoding implies more time for post processing. We will see that one of the cards has insanely fast decoding speeds, and try to find out why.
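The logic behind such a benchmark can be sketched as follows. This is our own assumption of the methodology, not the actual tool used in the review: time a decode loop to get frames per second, then treat whatever is left of each frame interval at real-time playback speed as headroom for post processing. The `decode_frame` callable is a hypothetical stand-in for the GPU decode call:

```python
import time

def decode_throughput(decode_frame, n_frames=500):
    """Time a decode loop and report average decoded frames per second.
    `decode_frame` is a placeholder for whatever decodes one frame."""
    start = time.perf_counter()
    for _ in range(n_frames):
        decode_frame()
    elapsed = time.perf_counter() - start
    return n_frames / elapsed

def postprocessing_headroom(decode_fps, playback_fps):
    """Fraction of each frame interval left over for post processing
    when decoding at decode_fps but playing back at playback_fps."""
    return 1.0 - playback_fps / decode_fps

# A GPU that decodes at 120 fps leaves 80% of each 24 fps frame
# interval free for deinterlacing, scaling and noise reduction
print(postprocessing_headroom(120, 24))   # 0.8
```

This is also why an anomalously high decode number is worth investigating: it may indicate the driver is skipping work rather than genuinely decoding faster.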
Over the last few months, we have also been keeping track of some exciting open source software in the HTPC area. Aiming to simplify the player setup and also take advantage of as many features of your GPU as possible, we believe these are very close to being ready for prime time. We will have a couple of sections covering the setup and usage of these tools.
Without further ado, let us go forward and take a look at the contenders.