Gigabyte P35X V3 Laptop Specs Review

The Gigabyte P35X is the frumpy best friend of gaming laptops, desperately in need of a makeover. The black aluminum chassis is about as bland as they come, with only a shiny chrome Gigabyte logo to add any semblance of panache on the outside. The inside isn't much better; the oval-shaped speaker grilles and silver power button are the only things that stand out. The P35X's rounded corners are nice, but the overall presentation is something I'd expect from a budget notebook, not something that costs more than $2,000. For the money, I'd much rather have the beautiful HP Omen and all its alluring angles. At 5.6 pounds and 15.2 x 10.6 x 0.82 inches, the Gigabyte is a tad chunky for a 15-inch laptop, especially when compared to the HP Omen (4.8 pounds, 15.1 x 9.7 x 0.78 inches). However, the P35X has an optical drive and beefier specs.

The display also held up well during gaming. I enjoyed crystal-blue skies and lush green vegetation as I made my way through Kyrat in Far Cry 4. I was impressed with the level of detail, which allowed me to see the individual hairs on a clouded leopard's hide as I made quick work of skinning it. The laptop performed well on the sRGB gamut test, which measures color reproduction, scoring 105.2 percent. That's well above the 84 percent mainstream average. The HP Omen managed a respectable 98 percent, while the MSI Dominator Pro delivered 89 percent. When tested for brightness, the P35X dazzled at 338 nits, destroying the 238-nit average, not to mention the Omen (269 nits) and the Dominator Pro (255 nits).


Gigabyte P35X V3 Specifications:
CPU : 2.5-GHz Intel Core i7-4710HQ
Operating System : Windows 8.1
RAM : 16GB
Storage : Dual 128GB mSATA SSDs plus a 1TB hard drive
Display Size : 15.6 inches
Resolution : 2880 x 1620
Graphics Card : Nvidia GeForce GTX 980M / Intel HD Graphics 4600
Video Memory : 8GB
Wi-Fi : 802.11ac/b/g/n
Ports : Gigabit Ethernet, 2 x USB 3.0, 2 x USB 2.0, Mini DisplayPort
USB Ports : 4
Size : 15.2 x 10.6 x 0.82 inches
Weight : 5.6 pounds
Warranty : Two years

Samsung Galaxy Tab S 8.4 Expected To Be The First Product With Iris Recognition Review

SRI International has announced an exclusive license deal with Samsung, granting the company use of its IOM (Iris on the Move) technology in its mobile devices.

The first product on the list is the Samsung Galaxy Tab S 8.4 tablet, which will come with the iris-recognition technology built in, as hinted by SRI in its announcement.


SRI International today announced an exclusive license of Iris on the Move® (IOM) technologies to Samsung for use in Samsung mobile products. Additionally, SRI has entered into a supply agreement to start production and sales of IOM-embedded Samsung mobile products for B2B applications. The initial product under this supply agreement will be a customized Samsung Galaxy Tab Pro 8.4 tablet with a built-in IOM iris module. The Galaxy Tab S lineup was introduced back at Mobile World Congress 2014, but we didn't see its successor this year. Mobile World Congress 2015 is history, but rather than new tablets, Samsung introduced its latest flagship devices, the Galaxy S6 and Galaxy S6 Edge.


SRI mentioned the product will be unveiled at the SIA New Product Showcase at ISC West 2015 (the largest security industry trade show in the U.S.) and offered worldwide through SRI partners and resellers.

We don’t know much about the upcoming product from the folks at Samsung featuring this new technology, but we’ll update you as soon as we get any further information about the upcoming Galaxy Tab S 8.4. Source: SRI

Asus C300M Specs Best Laptop Review

The Asus C300M is one such device. With its dual-core Intel Celeron N2830 CPU, it has low-end specs for a laptop but more processing power than most other Chromebooks. It's not the first to market with more screen for your money; in fact, some devices have broken through the 1080p display barrier. The C300M doesn't, but the design, specs, performance and price point still make for a machine that's worth a look. A word of warning first, though: you may need to put on your sunglasses.

Design

With a slim profile boasting a height of 20.3mm from lid to base, the fanless C300M has Ultrabook styling and Chromebook mobility. At 339mm wide and 230mm deep, its 13.3-inch screen is part of a trend towards larger displays. The first wave of 11.6-inch Chromebooks, like the now-classic Samsung Series 3, are beginning to look like toys in comparison.

Chromebooks are usually a bit plasticky, and that's fine when you're paying £200 (around $250 in the US, in the case of this model) or thereabouts for a computer you can chuck in a rucksack. The Asus C300M just about falls into that ballpark, with a lid barely 5mm thick and a casing that's distinctly springy in places. The build is solid though, and it looks reassuringly like a standard notebook until you pick it up.


The rounded corners, common to Chromebooks, add a bit of softness to the overall design. If we were to guess the branding decisions behind it, we'd go with the student market. It's a funky rather than functional-looking wrapper for a piece of kit that's good for web research, media consumption and essay writing.

What you lose in brawn you gain in portability: it weighs just 1.4kg (3.08lb). That just about enables you to carry it around open, cup of tea in one hand and Chromebook in the other.

Until recently it was en vogue to disguise the construction of Chromebooks with a grey or black finish, as though pretending to be metal or carbon would somehow atone for the reliance on synthetic polymers. Not the C300M. It's loud and proud, with off-primary finishes that include blue, yellow, red and, um, black. We road-tested the red edition which, truth be told, is a bit closer to blood orange.

It's at this point that we would be making a joke like "it's too orangey for Chrome" if we hadn't enjoyed using the C300M. But we did enjoy using it, very much. The candy-coated colouring added something to that.

Specifications:
CPU: 2.16GHz dual-core Intel Celeron N2830 processor
Graphics: Intel HD Graphics
RAM: 2GB DDR3
Screen: 13.3-inch 16:9 HD (1366 x 768)
Storage: 32GB SSD
Ports: 1 x USB 3.0, 1 x USB 2.0, HDMI, SD card slot, headphone/mic jack
Connectivity: Integrated 802.11a/b/g/n/ac, Bluetooth 4.0
Camera: HD webcam
Weight: 1.4kg (3.08lb)
Size: 33.9 x 23.0 x 2.03 cm (W x D x H)

Nvidia GeForce GTX Titan X Review

NVIDIA is back with another GeForce GTX Titan, and this time the company is looking to recapture a lot of the magic of the original Titan. First teased back at GDC 2015 in an Epic Unreal Engine session, and used to drive more than a couple of demos at the show, the GTX Titan X gives NVIDIA’s flagship video card line the Maxwell treatment, bringing with it all of the new features and sizable performance gains that we saw from Maxwell last year with the GTX 980. To be sure, this isn’t a reprise of the original Titan – there are some important differences that make the new Titan not the same kind of prosumer card the original was – but from a performance standpoint NVIDIA is looking to make the GTX Titan X as memorable as the original. Which is to say that it’s by far the fastest single-GPU card on the market once again.

NVIDIA has assembled a new Maxwell GPU, GM200 (aka Big Maxwell). We’ll dive into GM200 in detail a bit later, but from a high-level standpoint GM200 is the GM204 taken to its logical extreme. It’s bigger, faster, and yes, more power hungry than GM204 before it. In fact, at 8 billion transistors occupying 601mm², it’s NVIDIA’s largest GPU ever. And for the first time in quite some time, virtually every last millimeter is dedicated to graphics performance, which coupled with Maxwell’s power efficiency makes it a formidable foe.

Diving into the specs, GM200 can, for most intents and purposes, be considered GM204 + 50%. It has 50% more CUDA cores, 50% more memory bandwidth, 50% more ROPs, and an almost 50% larger die. Packing a fully enabled version of GM200, the GTX Titan X gets 3072 CUDA cores and 192 texture units (spread over 24 SMMs), paired with 96 ROPs. Meanwhile, considering that even the GM204-backed GTX 980 could outperform the GK110-backed GTX Titans and GTX 780 Ti thanks to Maxwell’s architectural improvements – one Maxwell CUDA core is quite a bit more capable than a Kepler core in practice, as we’ve seen – the GTX Titan X is well geared to shoot well past the previous Titans and the GTX 980.
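To make that scaling concrete, here is a quick arithmetic sketch of the "GM204 + 50%" characterization. The GTX 980 baseline figures (2048 CUDA cores, 128 texture units, 64 ROPs, 224GB/sec of bandwidth) are NVIDIA's published GM204 specifications rather than numbers quoted in this review:

```python
# Sanity-checking the "GM204 + 50%" characterization. The GTX 980
# (full GM204) baseline figures are NVIDIA's published specs.
gm204 = {
    "CUDA cores": 2048,
    "texture units": 128,
    "ROPs": 64,
    "memory bandwidth (GB/s)": 224,  # 256-bit bus at 7Gbps
}

for spec, value in gm204.items():
    print(f"{spec}: {value} -> {value * 1.5:g} (+50%)")

# Output lines up with GM200/GTX Titan X: 3072 CUDA cores,
# 192 texture units, 96 ROPs, and 336GB/s on a 384-bit bus.
```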

graphic cards for laptops, nvidia laptop graphics card, buy nvidia graphics card, nvidia gpu, what are the best laptops for gaming, the best laptop brands, best laptops brands, best gaming laptop brands

Compared to the GTX Titan Black, this is one of the few areas where the GTX Titan X doesn’t have an advantage in raw specifications – there’s really nowhere to go until HBM is ready – however in this case the numbers can be deceptive, as NVIDIA has heavily invested in memory compression for Maxwell to get more out of the 336GB/sec of memory bandwidth it has available. The 12GB of VRAM, on the other hand, continues NVIDIA’s trend of equipping Titan cards with as much VRAM as they can handle, and should ensure that the GTX Titan X has VRAM to spare for years to come. Meanwhile, sitting between the GPU’s functional units and the memory bus is a relatively massive 3MB of L2 cache, retaining the same 32K:1 cache:ROP ratio of Maxwell 2 and giving the GPU more cache than ever before to try to keep memory operations off of the memory bus.

As for clockspeeds, as with the rest of the Maxwell lineup the GTX Titan X is getting a solid clockspeed bump from its Kepler predecessor. The base clockspeed is up to 1GHz (reported as 1002MHz by NVIDIA’s tools) while the boost clock is 1075MHz. This is roughly 100MHz (~10%) ahead of the GTX Titan Black and will further push the GTX Titan X ahead. However, as is common with larger GPUs, NVIDIA has backed off on clockspeeds a bit compared to the smaller GM204, so the GTX Titan X won’t clock quite as high as the GTX 980, and the overall performance difference on paper is closer to 33% when comparing boost clocks.
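That ~33% figure falls out directly from cores times clockspeed. A minimal check, assuming the GTX 980's published boost clock of 1216MHz (a figure not quoted in this review):

```python
# Paper shader throughput at boost clocks: CUDA cores x MHz.
# GTX 980 numbers (2048 cores, 1216MHz boost) are NVIDIA's
# published specs; Titan X numbers are from this review.
titan_x = 3072 * 1075
gtx_980 = 2048 * 1216

print(f"Titan X vs GTX 980: {titan_x / gtx_980:.2f}x")  # ~1.33x
```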

Power consumption on the other hand is right where we’d expect it to be for a Titan class card. NVIDIA’s official TDP for GTX Titan X is 250W, the same as the previous single-GPU Titan cards (and other consumer GK110 cards). Like the original GTX Titan, expect GTX Titan X to spend a fair bit of its time TDP-bound; 250W is generous – a 51% increase over GTX 980 – but then again so is the number of transistors that need to be driven. Overall this puts GTX Titan X on the high side of the power consumption curve (just like GTX Titan before it), but it’s the price for that level of performance. Practically speaking 250W is something of a sweet spot for NVIDIA, as they know how to efficiently dissipate that much heat and it ensures GTX Titan X is a drop-in replacement for GTX Titan/780 in any existing system designs. Via : Anandtech
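The 51% TDP delta can be verified the same way, assuming the GTX 980's published 165W TDP (again, not a figure quoted in this review):

```python
# TDP comparison; the GTX 980's 165W TDP is NVIDIA's published
# spec, not a figure from this review.
titan_x_tdp, gtx_980_tdp = 250, 165
print(f"{(titan_x_tdp / gtx_980_tdp - 1) * 100:.1f}% increase")  # 51.5%
```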

Microsoft’s Cortana voice assistant heading to Android, iOS: Report

Apple’s phones and tablets have Siri. Google Android has “OK Google.” And Windows Phone has Cortana. But the lines might get blurry soon.

Cortana is Microsoft’s voice-assistant software, which lets you search the web, set reminders, initiate phone calls, and much more. While the service was first introduced for Windows Phone, it will also be available to desktop and tablet users when Microsoft releases Windows 10.

It’s not clear if you’ll be able to use all of Cortana’s features on an iPhone or Android device, since Microsoft’s personal assistant won’t be built into the operating system and will instead run as an app… which means it won’t have full system access and will have the same limitations you’d expect from any third-party app.

According to Reuters, Microsoft is also working on an artificial intelligence project called “Einstein” which is designed to make Cortana more powerful. Among other things, it’ll be able to “read and understand email” and other items in order to provide you with information you might need before you even ask for it. For example, it could remind you when it’s time to leave for the airport based on an email confirmation for a flight reservation.

Google Now already does something similar for Android users, while Apple’s Siri is designed mostly to respond to requests rather than attempting to anticipate your needs.

Reuters reports that Microsoft plans to launch Cortana apps for Android and iOS.

ASRock X99 Extreme11 - Eighteen SATA Ports with Haswell-E Review

The original X79 Extreme11 offered fourteen SATA ports, featuring six from the PCH and eight from the bundled LSI 3008 onboard controller. Our sample back then used eight PCIe lanes for the controller and achieved 4 GBps maximum sequential read and write speeds with an eight-drive SF-2281 SSD RAID-0 array. Between the X79 and the X99 model came the Z87 Extreme11/ac, which used the same LSI controller but bundled it with a port multiplier, giving sixteen SAS/SATA ports plus the six from the chipset for 22 in total. When we come to the X99 Extreme11 in this review, we get the same 3008 controller (without the multiplier), which adds eight ports to the ten from the PCH, giving eighteen in total.
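The port counts across the three generations are simple sums of chipset and controller ports; a quick sketch of the arithmetic described above:

```python
# SATA port totals for the three Extreme11 generations, per the
# breakdown in this review.
boards = {
    "X79 Extreme11":    {"PCH": 6,  "LSI 3008": 8},   # no port multiplier
    "Z87 Extreme11/ac": {"PCH": 6,  "LSI 3008": 16},  # LSI behind a multiplier
    "X99 Extreme11":    {"PCH": 10, "LSI 3008": 8},   # no port multiplier
}

for board, sources in boards.items():
    print(f"{board}: {sum(sources.values())} SATA ports")
# X79: 14, Z87: 22, X99: 18
```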


One of the criticisms of the range is the lack of useful hardware RAID modes with the LSI 3008. It only offers RAID 0 and 1 (also 1E and 10), with no scope for RAID 5/6. This is partly because the controller comes without any cache (or at best a very small one), which is needed to manage such an array. ASRock puts this down partly to controller cost and the complexity of implementation, suggesting that users who require these modes should use a software RAID solution. Users who want a hardware solution will have to buy a controller card that supports it, and ASRock is keen to point out that the Extreme11 range has plenty of PCIe bandwidth to handle it.

The amount of PCIe bandwidth brings up another interesting element of the Extreme11 range. ASRock feels that its high-end motherboard range must support four-way GPU configurations, preferably in an x16/x16/x16/x16 lane allocation. In order to do this, along with having enough lanes for the LSI 3008 controller that needs eight, the X99 Extreme11 carries two PLX8747 PCIe switches on board. We covered the PLX8747 during its prominent use in the Z77 era, but the short summary is that, due in part to its FIFO buffer, it can multiplex 8 or 16 PCIe lanes into 32. Thus, for the X99 Extreme11 and its dual PLX8747 arrangement, each PLX switch takes 16 lanes from the CPU to give two PCIe 3.0 x16 slots, totaling four PCIe 3.0 x16 slots overall. The final eight lanes from the CPU go to the LSI controller, accounting for all 40 lanes from the processor. (28-lane CPUs behave a little differently; see below.)
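For clarity, here is a small sketch of that lane budget, covering both the 40-lane allocation described above and the 28-lane i7-5820K case covered later in this review:

```python
# CPU PCIe lane budget on the X99 Extreme11, per this review.
# Each PLX8747 multiplexes its x8/x16 uplink into 32 downstream
# lanes, so the four x16 slots apply in both cases.

def allocate(cpu_lanes):
    if cpu_lanes == 40:    # e.g. Core i7-5930K / i7-5960X
        return {"PLX8747 #1": 16, "PLX8747 #2": 16, "LSI 3008": 8}
    if cpu_lanes == 28:    # Core i7-5820K; second M.2 slot disabled
        return {"PLX8747 #1": 8, "PLX8747 #2": 8, "LSI 3008": 8, "M.2 #1": 4}
    raise ValueError("Haswell-E parts expose 28 or 40 PCIe lanes")

for lanes in (40, 28):
    budget = allocate(lanes)
    assert sum(budget.values()) == lanes  # every lane accounted for
    print(f"{lanes}-lane CPU: {budget}")
```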

As you might imagine, two PLX8747 switches and an LSI controller onboard do not come cheap, and that is why the Extreme11 is one of the most expensive X99 motherboards on the market at $630+, bested in this competition only by the ASRock X99 WS-E/10G, which comes with a dual-port 10GBase-T controller for $670. Aside from the four PCIe 3.0 x16 slots and 18 SATA ports, the Extreme11 also comes with support for 128GB of RDIMMs, LGA2011-3 Xeon compatibility, dual Intel network ports, upgraded audio and dual PCIe 3.0 x4 M.2 slots. The market ASRock is aiming for with this board has high storage and compute requirements in its workstations – typically with these builds the motherboard cost is not that important, but the feature set is. That makes the X99 Extreme11 an entertaining product in an interesting market segment.

With the extra SATA ports and controller chips onboard, the Extreme11 expands into the EATX form factor, which means an extra inch or so horizontally in motherboard dimensions. Aside from the big block of SATA ports, nothing looks untoward on the board, with an extended heatsink running from the power delivery down to the chipset heatsink, which has an added fan to deal with the two PLX8747 chips and the LSI 3008 controller.

The socket area is fairly cramped, right up to Intel’s specifications, with ASRock’s Super Alloy based power delivery packing in twelve phases in an example of over-engineering. The DRAM slots are color-coded, with the black slots to be occupied first. Within the socket area there are four fan headers to use – two CPU headers in the top right (4-pin and 3-pin), a 3-pin header just below the bottom left of the socket (above the PCIe slot) and another 3-pin near the top of the SATA ports. The other two fan headers at the bottom of the board are one 4-pin and another 3-pin, with the final fan header provided for the chipset fan. This can be disabled if required by removing the cable.

The bottom right of the motherboard, next to the SATA ports and under the chipset heatsink, hides the important and costly controller chips. Combined, the two PLX8747 switches on the left, the LSI RAID controller and the chipset draw north of 30W in total, hence the extra fan on the chipset.

Each PLX8747 PCIe switch can take in eight or sixteen PCIe 2.0 or PCIe 3.0 lanes and then, using a combination of a FIFO buffer and multiplexing, output 32 PCIe 3.0 lanes. Sometimes this sounds like magic, but it is best to think of it as a switching FPGA – between the PCIe slots, we have full PCIe 3.0 x16 bandwidth, but if we go up the pipe back to the CPU, we are still limited by that 8/16-lane input. The benefit of the FIFO buffer is a fill-twice/pour-once scenario, coalescing commands and shooting them up the data path together rather than performing a one-in/one-out. In our previous testing the PLX8747 gave a sub-1% performance deficit in gaming, but it aids compute users who need inter-GPU bandwidth. It also sidesteps SLI’s fixed requirement of eight PCIe lanes per card, ensuring that NVIDIA configurations are happy.

The LSI 3008 is a little long in the tooth, having been on the X79 and Z87 Extreme11 products, but it does what ASRock wants it to do – provide extra storage ports for those that need them. Getting a case that can support 18 drives is another matter – we often see companies like Lian Li show them at Computex, and some cost as much as the motherboard. The next cost is all the drives, but I probably would not say no to an 18*6 TB system. The lack of RAID 5/6 redundancy offerings is still a limitation, as is the lack of a cache. Moving up the LSI stack to a controller that does offer RAID 5/6 would add further cost to the product, and at this point ASRock has little competition in this space.

On the back of the motherboard is an interesting IC from Everspin, which turns out to be 1MB of cache for the LSI controller. There is scope for ASRock to put extra cache on the motherboard, allowing for higher-end RAID controllers, but the cost/competition scenario comes into play again.

The final part of the RAID controller is an MXIC chip, which looks to be a 128Mbit flash memory IC with 110ns latency.

Aside from the fancier features, the motherboard has two USB 3.0 headers above the SATA ports (both from the PCH), power/reset buttons, a two-digit debug display, two BIOS chips with a selector switch, two USB 2.0 headers, a COM header, and the usual front panel/audio headers. Bang in the middle of the board, between the PCIe slots and the DRAM slots, is a 4-pin molex connector to provide extra power to the PCIe slots when multiple hungry GPUs are in play. There is also another power connector below the PCIe slots, but ASRock has told us that only one needs to be occupied at any time. I have mentioned to ASRock that the molex connector is falling out of favor with PSU manufacturers and very few users actually need one in 2015, as well as the fact that these connectors are both in fairly awkward places. The response was that molex is the easiest to apply (compared to SATA power or 6-pin PCIe power), and that the one in the middle of the board is for users with smaller cases. I have a feeling that ASRock won’t shift much on this design philosophy unless it develops a custom connector.

The PCIe slots give x16/x16/x16/x16, with a middle slot that uses eight PCIe 3.0 lanes; when that slot is in use, the slot underneath splits with it into an x8/x8 arrangement. With sufficiently sized cards, this allows five cards in total. Normally we would see the potential for a seven-card setup, but ASRock has decided to implement two PCIe 3.0 x4 M.2 slots in between a couple of the PCIe slots. The bandwidth for these slots comes from the CPU’s PCIe lanes, and thus they do not get hardware RAID capabilities. However, given that the PM951 is about to be released, two of them in a software RAID pushing 2800 MBps+ sequentials, alongside an 18*6 TB array, would make for a super storage platform.

For users wanting to pair the 28-lane i7-5820K with this motherboard, the PCIe allocation is a little harder to explain. The CPU gives eight lanes to each of the PLX controllers, so the full x16/x16/x16/x16 arrangement still applies, with another eight lanes going to the LSI controller. The first M.2 x4 slot gets the last four lanes, and the second M.2 slot is disabled.

The rear panel gives four USB 2.0 ports, a combination PS/2 port, a Clear CMOS button, two eSATA ports, two USB 3.0 from the PCH, two USB 3.0 from an ASMedia controller, an Intel I211-AT network port, an Intel I218-V network port and audio jacks from the Realtek ALC1150 audio codec.