News


Project Tango Demoed with Qualcomm at SIGGRAPH 2016

Project Tango at this point is probably not new to anyone reading this, as we’ve discussed it before, but over the past few years Google has been hard at work turning positional tracking and localization into a consumer-ready application. While an early tablet with an Nvidia Tegra SoC inside was available, it had a number of issues on both the hardware and software sides. As the Tegra SoC was not really designed for the workloads Project Tango puts on a mobile device, much of the work was done on the GPU and CPU, with offloading to dedicated coprocessors: STMicroelectronics Cortex-M3 MCUs for sensor hub and timestamp functionality, computer vision accelerators like Movidius’ VPU, and other chips that ultimately increased BOM and board area requirements.

At SIGGRAPH today Google recapped the algorithmic progress we’ve already seen at Google I/O, namely the work put into polishing the sensor fusion, feature tracking, modeling, texturing, and motion tracking aspects of Tango. Anyone who has looked into how well smartphones can act as inertial navigation devices will know that it’s basically impossible to avoid massive integration error, which forces the device to rely on constant location updates from an outside source to avoid drifting.

With Tango, the strategy taken to avoid this problem works at multiple levels. At a high level, sensor fusion combines camera data and inertial data to cancel out noise in both systems. On the camera side, feature tracking on the cameras combined with the dedicated depth-sensing camera helps with visualizing the environment for both mapping and augmented reality applications. Pairing a traditional camera with a fisheye camera also allows for a sort of distortion correction, and the parallax between the two provides an additional sanity check on depth, although if you’ve ever tried a dual-lens solution on a phone you can probably guess that this distance figure isn’t accurate enough to rely on completely. These are hard engineering problems, so it hasn’t been until recently that we’ve seen programs that can do all of these things reliably. Google disclosed that without using local anchor points in memory, the system drifts at a rate of about 1 meter every 100 meters traversed, so if you never return to previously mapped areas the device will eventually accumulate a non-trivial amount of error. However, if you do return to a previously mapped area, the algorithms used in Tango can reset the location tracking and eliminate the accumulated error.
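The interplay between dead-reckoning drift and loop closure can be sketched with a toy calculation. This is purely illustrative, not Tango’s actual algorithm: it just applies Google’s disclosed ~1% drift rate to a walked path and zeroes the error whenever a previously mapped area (an anchor) is revisited.

```python
# Toy illustration (not Tango's actual algorithm): dead-reckoning error
# accumulates at roughly 1 m per 100 m traveled, and a loop closure
# (revisiting a mapped anchor) resets the accumulated error.

DRIFT_RATE = 1.0 / 100.0  # ~1 m of error per 100 m traveled (Google's figure)

def accumulated_error(path_lengths_m, loop_closures):
    """Track worst-case drift over a sequence of path segments.

    path_lengths_m: length of each segment walked, in meters.
    loop_closures:  indices of segments after which a previously
                    mapped area is revisited, zeroing the error.
    """
    error = 0.0
    history = []
    for i, length in enumerate(path_lengths_m):
        error += length * DRIFT_RATE
        if i in loop_closures:
            error = 0.0  # anchor match: drift is corrected
        history.append(error)
    return history

# 500 m walked with no loop closure: error grows to ~5 m.
print(accumulated_error([100] * 5, loop_closures=set()))
# Same walk, but the device recognizes a mapped area after segment 2.
print(accumulated_error([100] * 5, loop_closures={2}))
```

The second run shows why returning to mapped areas matters: the error never gets a chance to grow without bound.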

With the Lenovo Phab 2 Pro, Tango is finally coming to fruition in a consumer-facing way. Google has integrated Tango APIs into Android for the Nougat release this fall. Of course, while software is one part of the equation, it’s going to be very difficult to justify supporting Tango capabilities if it needs all of the previously mentioned coprocessors in addition to the depth sensing camera and fisheye camera sensors.

In order to enable Tango without cutting into battery size or general power efficiency, Qualcomm has been working with Google to run the Tango API entirely on the Snapdragon SoC rather than on dedicated coprocessors. Snapdragon SoCs generally have a global synchronous clock, and Tango pushes this to its full extent by timestamping multiple sensors against that one clock to enable the previously mentioned sensor fusion. In addition, processing is done on the Snapdragon 652 or 820’s ISP and Hexagon DSP, as well as the integrated sensor hub with its low-power island. The end result is that enabling the Tango APIs requires no processing on the GPU and relatively little on the CPU, so Tango-enabled applications can run without hitting thermal limits, leaving headroom for more advanced applications built on the Tango APIs. Qualcomm claimed that less than 10% of CPU cycles and less than 35% of DSP cycles are needed on the Snapdragon 652 and 820. Qualcomm noted in further discussion that using the Hexagon Vector Extensions would further cut down on CPU usage, as much of the current CPU load sits on the NEON vector units.

To see how all of this translates in practice, Qualcomm showed off the Lenovo Phab 2 Pro with some preloaded demo apps, including a home improvement application from Lowe’s that supports size measurements and live previews of appliances in the home with a fairly high level of detail. The quality of the augmented reality visualization is shockingly good: the device can differentiate between walls and the floor, so you can’t just stick random things in random places, and the placement of objects is stable enough that there’s none of the strange floatiness that often accompanies augmented reality. Objects are redrawn fast enough that camera motion results in seamless, fluid motion of virtual objects, and in general I found it difficult to spot any real issues in execution.

While Project Tango still seems to have some bugs to iron out and some features and polish to add, the ecosystem has now progressed to the point where Tango API features are basically ready for consumers. The environment tracking needed for true six-degree-of-freedom movement surely has implications for mobile VR headsets as well, and given that only two extra cameras are needed to enable Tango API features, it shouldn’t be that difficult for high-end devices to integrate them, although due to the size of these sensors, Tango may be more targeted at phablets than regular smartphones.

Apple Announces Q3 FY 2016 Results: App Store Up, Hardware Down

Today Apple announced their third quarter results for fiscal year 2016. Much like last quarter, Apple has struggled to maintain with the iPhone 6s the sales pace set by the iPhone 6. For the quarter, Apple had revenue of $42.358 billion, down 11% from a year ago. Gross margin was $16.106 billion, down from $19.681 billion in Q3 2015, or 38.0% of revenue. Operating income was $10.1 billion, down from $14.5 billion last year, and net income was down almost $3 billion to $7.8 billion. Diluted earnings per share were $1.42, down from $1.85 a year ago. Despite the lower quarter, Apple did beat expectations, which has helped their share price in after-hours trading.

Apple Q3 2016 Financial Results (GAAP)
  Q3’2016 Q2’2016 Q3’2015
Revenue (in Billions USD) $42.358 $50.557 $49.605
Gross Margin (in Billions USD) $16.106 $19.921 $19.681
Operating Income (in Billions USD) $10.105 $13.987 $14.473
Net Income (in Billions USD) $7.796 $10.516 $10.677
Margins 38.0% 39.4% 39.7%
Earnings per Share (in USD) $1.42 $1.90 $1.85
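The margin row in the table above is just gross margin dollars as a share of revenue; a quick arithmetic check against the reported figures reproduces it:

```python
# Sanity-check the GAAP table: margin percentage is gross margin
# dollars divided by revenue, rounded to one decimal place.

quarters = {
    "Q3'2016": (42.358, 16.106),  # (revenue, gross margin) in $B
    "Q2'2016": (50.557, 19.921),
    "Q3'2015": (49.605, 19.681),
}

for quarter, (revenue, gross) in quarters.items():
    pct = round(100 * gross / revenue, 1)
    print(f"{quarter}: {pct}%")  # 38.0%, 39.4%, 39.7%
```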

Apple announced a dividend of $0.57 per share payable on August 11th to shareholders of record as of August 8th. They also returned over $13 billion during Q3 through share buy-backs and dividends, and they have completed almost $177 billion of their $250 billion capital return program.

iPhone sales are far and away the largest part of the company, and this quarter Apple sold 40.4 million handsets. That is down from the 51.2 million last quarter, and 47.5 million in Q3 2015, meaning iPhone sales were down 15% year-over-year. This resulted in revenue of $24 billion, down 23% from a year ago. It’s certainly a noticeable drop, and it shows just how successful the iPhone 6 was when it launched.

Moving on, iPad sales continued their slow and steady decline. Sales of the tablet were just a hair under ten million for the quarter, a drop of 9% year-over-year. Revenue, however, was $4.9 billion, up 7%. A year ago the average selling price of the iPad was $415, but this quarter it rose $75 to $490. Declining sales of the iPad Mini and new sales of the higher-priced iPad Pro are certainly both factors, but Apple doesn’t break out numbers for individual models, so it’s impossible to know exactly how much each contributed.
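Apple doesn’t report ASP directly; it falls out of dividing segment revenue by unit sales, both of which appear in the tables below. A short check recovers the figures quoted above:

```python
# Average selling price = segment revenue / unit sales, using Apple's
# reported figures (revenue in $B, units in thousands).

def asp(revenue_billions, units_thousands):
    return revenue_billions * 1e9 / (units_thousands * 1e3)

ipad_q3_2016 = asp(4.876, 9_950)    # ≈ $490
ipad_q3_2015 = asp(4.538, 10_931)   # ≈ $415
print(round(ipad_q3_2016), round(ipad_q3_2015),
      round(ipad_q3_2016 - ipad_q3_2015))  # the ~$75 rise
```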

The Mac didn’t fare very well either, with unit sales of 4.25 million, down 11% year-over-year. This resulted in revenue of $5.24 billion, down 13%. With basically no Mac refreshes in a long time, the lineup is no longer outperforming the PC market as a whole, as it had been for quite a while.

Apple’s “Other Products” category includes the Apple TV, Apple Watch, Beats, iPods, and accessories, and while none of it is broken down by sub-category, Other Products as a whole fell 16% in revenue compared to Q3 2015, coming in at $2.22 billion for the quarter.

Apple Q3 2016 Device Sales (thousands)
  Q3’2016 Q2’2016 Q3’2015 Seq Change Year/Year Change
iPhone 40,399 51,193 47,534 -21% -15%
iPad 9,950 10,251 10,931 -3% -9%
Mac 4,252 4,034 4,796 +5% -11%
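The change columns above are straightforward percentage deltas against the prior quarter and the year-ago quarter, which is easy to verify from the unit figures:

```python
# Recompute the sequential and year-over-year change columns from
# Apple's reported unit sales (in thousands).

units = {  # (Q3'2016, Q2'2016, Q3'2015)
    "iPhone": (40_399, 51_193, 47_534),
    "iPad":   (9_950, 10_251, 10_931),
    "Mac":    (4_252, 4_034, 4_796),
}

def pct_change(now, then):
    return round(100 * (now - then) / then)

for product, (q3_16, q2_16, q3_15) in units.items():
    print(f"{product}: seq {pct_change(q3_16, q2_16):+d}%, "
          f"y/y {pct_change(q3_16, q3_15):+d}%")
```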

The one segment in which Apple had strong growth was Services. Services revenue was $5.976 billion, up almost a billion, or 19%, compared to Q3 2015. Q2 2016 revenue was pretty much the same at $5.991 billion, meaning services have once again outpaced both Mac and iPad sales, and now represent the second largest segment at Apple.

Apple Q3 2016 Revenue by Product (billions)
  Q3’2016 Q2’2016 Q3’2015 Share of Q3’2016 Revenue
iPhone $24.048 $32.857 $31.368 56.8%
iPad $4.876 $4.413 $4.538 11.5%
Mac $5.239 $5.107 $6.030 12.4%
iTunes/Software/Services $5.976 $5.991 $5.028 14.1%
Other Products $2.219 $2.189 $2.641 5.2%

Overall, this is the second consecutive quarter of declining revenue, and last quarter was the first such decline since Q1 2003, so Apple is in somewhat unfamiliar territory. Their guidance for next quarter is $45.5 to $47.5 billion in revenue, with margins between 37.5% and 38%. That guidance also implies a year-over-year decline, since Q4 2015 had the company coming in at $51.5 billion and 39.9% margins. It will be interesting to see if hardware refreshes in the fall can stop the slide in sales.

Source: Apple Investor Relations
