Monday, October 24, 2016

Is Brexit a Risk to US Growth?

mike smitka

I made a short presentation as a component of board education for a local bank. The president was curious about Brexit, so I used that as a point of departure. What follows are thumbnails of slides; I add several at the end that (as expected) I did not get to during my talk.

First, I began by emphasizing that there is no business cycle, a point made in a previous posting on this blog. By chance the US had three recessions with similar timing in the immediate aftermath of WWII, but that hasn't happened since. Just because we've gone seven years without a recession doesn't mean that one is more likely in the next year. Furthermore, the apparent causes vary, so predicting on the basis of past recessions is pointless. Once a recession has begun, certain changes do occur – but those can also arise without a recession being underway. So it is possible to calculate recession probabilities, but they are weak and at best provide information on the next several quarters.

Still, we can think about potential threats. Is Brexit one? A quick look at recent data suggests "no". The pound has depreciated by about 25%, and because the UK is relatively "open" – trade is about one-third of GDP – that will result in some uptick in consumer prices. But so far that's been modest (from 0% inflation to 1% inflation). Meanwhile, unemployment is falling, not rising, and there's no evidence so far of a slump in GDP growth. Interest rates, meanwhile, remain at zero. No surprise there: that's been true across the developed world, thanks to the Great Recession and the stalwart refusal of most countries to use fiscal policy.

Even if British growth does slow, that has only a modest impact on us in the US, because the UK is a modest part of the global economy, at a bit under 3%. Now if problems there hurt the remainder of the EU, then that starts to change – the EU is bigger than the US, but it's still only about 1/6th of the world economy. If we are worried about flash points, then we really need to look at the rest of the world, because growth in the developed world has been slower than elsewhere. NAFTA comes to only about 20% of the world economy; add in the EU and Japan, and the developed countries are now less than half of the global economy. That's good news – the more growth elsewhere, the better for everyone. But we need to pay attention to China. No more on that here.

Meanwhile NAFTA as a whole is doing relatively well, with the IMF projecting 2017 growth in all three member countries at over 2%, higher than the UK or any of the larger continental economies.

We do, however, need to set concerns about the future in context. Due to demographics, growth in the US will not hit 4%, no matter what politicians do. The baby boomers are retiring, and so we have very low labor force growth. While we still have slack, we are now – at long last – approaching historic levels of labor utilization. Indeed, I'm slightly hopeful that, with the number of those working involuntarily short hours now close to historic levels, more demand will serve to pull those prime-aged Americans who dropped out in 2008-9 back into the labor force. So we can continue to grow at 2% for a couple of years, but then things will slow, independent of whether the Fed raises interest rates.

The graphs below present several snippets. The first is the "broad" measure of unemployment (the black line, U-6) relative to the "headline" level in green. It's still high, but at least close to the level of the better time periods since 1994. (Consistent data aren't available before then.) Second, we can see the loss and now addition of jobs, all relative to my calculation of the number of additional jobs needed to keep up with population growth, net of "boomer" retirement. The third graph is a variant on the second, looking at jobs relative to my attempt to calculate a "normal" baseline. Finally, there's age-specific participation, which continues to be well below normal – if we extend back before 2007, these curves are all flat. (That does hide other dynamics, such as a drop in male participation prior to 2007 that is offset by a rise in female participation.) This graph suggests that we have further to go until full recovery – I'm hopeful of end-2017, but if you project participation forward, it's 2019.

Now the Rockbridge, Virginia region is not average; we depend more on retirement and tourism, and construction. The bad news is that, for the nation as a whole, housing starts have been falling relative to the population for the past 50 years. We may not see an uptick at the national level. But the good news is that the CoreLogic data on mortgages shows the share with negative or near-negative equity has fallen from almost half of all homes to a quarter. Most of that has been due to the drop in the number of those with negative equity, either because over time the combination of higher housing prices and the repayment of loan principal has pulled them into the 80%-100% loan-to-value bracket, or because foreclosures mean there are now new owners. Since as a small bank you do hold some mortgages on your books, that is reassuring, and it's even better that net homeowner equity has risen by over $4 trillion since the start of 2013.

What happens locally is, however, more a function of whether retirees in nearby urban areas can sell their houses and buy a new one here. House prices in the mid-Atlantic region aren't uniformly above the bubble peak of late 2006, but Charlotte and Atlanta are close or positive, and the Washington DC area looks pretty good relative to much of the country. You can find all that data on FRED. Finally, my work is on the auto industry, and car – or, more accurately, light truck – sales are strong, but that does imply you should not expect much upside. With only one dealership left in the county, car loans may not be a big part of your portfolio, but don't look for them to grow except as you increase market share. Production is strong, too, despite the bad-mouthing of NAFTA. The plants of Lear and Dana are long gone, so that's less relevant, but it is consistent with the picture of sales near their peak, reinforced by sales per employed person being back around 0.14 (scaled by 1,000).

Finally, what of interest rates? We should not expect the Fed to raise them quickly, or far. For the past 25 years interest rates have been falling. Part of that is that for the US as a whole inflation has fallen, and expectations seem to have gradually factored that in as a permanent change. Ditto in the major trading partners with whom we're linked financially, several of whom are actually experiencing mild deflation. In addition, it seems that despite all the hype around Silicon Valley, investors aren't expecting growth to pick up, either. Long-term interest rates are at 3.5%. We should take that with a grain of salt, as they've seldom been a good predictor of things 20 years hence. But if we return to 2% inflation, that means those in the bond market are factoring in growth of at most 1.5% (we surely need to deduct a risk premium, in which case growth and/or inflation must be lower).
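The arithmetic in that last sentence can be made explicit. Here is a minimal sketch: the 3.5% yield and 2% inflation scenario are from the text, while the 0.5% risk/term premium is purely an illustrative assumption.

```python
# Decomposing the 3.5% long-term nominal yield cited above.
nominal_yield = 0.035
expected_inflation = 0.02  # the "return to 2% inflation" scenario

# Real return the bond market is implicitly pricing in:
implied_real = nominal_yield - expected_inflation
print(f"implied real return: {implied_real:.1%}")  # 1.5%

# Any positive risk/term premium pushes the implied growth rate lower still:
term_premium = 0.005  # assumed for illustration
print(f"implied growth net of premium: {implied_real - term_premium:.1%}")  # 1.0%
```

A higher assumed premium, or any expected inflation above 2%, forces the implied growth number even lower – which is the point of the paragraph above.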

Now I didn't include yield differentials. Those are quite volatile – the yield curve moves around a lot – and are neither high nor low today. So if the profits of a bank come from using short-term borrowing (including deposits) to fund longer-term loans, there's nothing to indicate conditions today are unusual. Have me back in a year and I can likely talk about the empirical evidence of interest rate normalization...

October 20, 2016

Autonomous Cars – Part 3: Technology Consolidation

Kaivan Karimi
SVP of Strategy and Business Development
BlackBerry Technology Solutions (BTS)


The amount of software in a car is mushrooming: a modern car contains 100 to 150 million lines of code, more than almost any other system.

Today cars are controlled via hardware electronic control units (ECUs) running those millions of lines of code. Most newer cars have 60 to 100 ECUs, and that number is growing; high-end cars can have even more. The new trend is to reduce the number of ECUs in favor of a smaller number of domain/area controllers. The idea is to reduce the complexity associated with software development, reduce the weight of the car, and reduce the overall cost. It also makes software upgradability less complex, so that software functionality can be enhanced to extend the life of a platform and offer a very large return on investment.

Another benefit is that software can be more easily upgraded Over-The-Air (OTA) to deliver minor or major fixes, respond to security issues, and provide other enhancements without the need to bring a car to the dealership. This not only saves time, but also adds to the safety, security, and reliability of the car, while lowering its overall maintenance cost. According to research firm IHS, about 4.6 million cars received OTA software updates for telematics applications last year, and by 2022 some 43 million cars are expected to be using OTA services – clearly a huge increase.
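To make the security side of OTA concrete, here is a minimal sketch of the integrity check a vehicle might run before staging an update image. Everything here is illustrative: production systems use asymmetric code signing (e.g. ECDSA) rather than a shared HMAC key, and the key and payload names are invented.

```python
import hashlib
import hmac

# Hypothetical shared key; HMAC keeps the sketch self-contained and runnable.
SIGNING_KEY = b"fleet-demo-key"

def sign_package(payload: bytes) -> bytes:
    """Back end: compute an authentication tag over the update image."""
    return hmac.new(SIGNING_KEY, payload, hashlib.sha256).digest()

def verify_before_staging(payload: bytes, tag: bytes) -> bool:
    """Vehicle: stage the image only if the tag matches (constant-time compare)."""
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

firmware = b"ecu-firmware-v2.1"
tag = sign_package(firmware)
print(verify_before_staging(firmware, tag))            # True
print(verify_before_staging(b"tampered image", tag))   # False
```

The constant-time comparison matters: a naive byte-by-byte `==` can leak timing information an attacker could exploit.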
Some of the other technology components of Advanced Driver Assistance Systems (ADAS) are noted below:

Maps
While most people do not consider maps a component of an ADAS system, in the future they will play a key role in helping drivers operate vehicles safely and adapt to driving changes based on location, such as changing which side of the road you drive on when you hit a border crossing. Maps provide a necessary input to augment the information provided by the various sensors in the car. This is not just macro-level geographical data for finding directions, but also data for augmenting functions such as camera-based traffic sign and roadway information detection, as well as infrastructure information. Cloud-based processing will then be used to integrate the data sent by all vehicles into a global map that gets updated cooperatively by all drivers, including potholes to avoid, newly added roadway signs, or rerouting due to construction.

Sensor Fusion
Sensor fusion means combining information and data from different sensors, leveraging the individual advantages of each sensor to complement and cover the weaknesses of the others. The whole is greater than the sum of its parts – that is, the individual sensors' functions. This is very similar to what our brain does. You do not need to touch a pot of boiling water to know it is very hot, because your eyes see the bubbling water and the steam at the top of the pot. In an ADAS system, the same thing happens: the sensor inputs are fused so that the ADAS domain controller can form a conclusive opinion about an event with better situational awareness, rather than relying on any single sensor's data. This notion is at the heart of how any robot operates, but it is especially important for the mission-critical functionality needed by connected autonomous cars.
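A minimal sketch of the idea, assuming two range sensors with known noise levels (the sensor names and variances are illustrative, not from the article): inverse-variance weighting, which is the static core of a Kalman-filter update.

```python
# Fuse two noisy range estimates of the same obstacle by inverse-variance
# weighting: the more trustworthy sensor gets the larger weight, and the
# fused estimate is more certain than either input alone.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Combine two estimates; the fused variance is smaller than either input."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Radar: precise ranging (low variance); camera: coarser range estimate.
radar_m, radar_var = 25.1, 0.04
camera_m, camera_var = 24.3, 1.00

dist, var = fuse(radar_m, radar_var, camera_m, camera_var)
print(f"fused distance: {dist:.2f} m, variance: {var:.3f}")
```

The fused answer sits close to the radar reading (its weight is 25x the camera's here), yet the camera still contributes – and if the radar were blinded, the same formula would degrade gracefully to the camera alone.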

HW & SW Roadmap to Consolidation
As the modern CPU increases in processing power and decreases in electrical power consumption thanks to smaller process geometries, one might conclude that consolidating multiple ECU functions onto one physical processor would yield significant cost savings. While that is true, consolidation needs to be balanced against a few important factors:

  1. The increase in leakage current as semiconductor process geometries get smaller (a downside of Moore's Law)
  2. Thermal issues that increase as clock speeds increase
  3. The extent to which the software can be multithreaded to take advantage of new multi-core processors
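Point 3 above is usually the binding constraint, and it is captured by Amdahl's law: if only a fraction p of the workload can run in parallel, the serial remainder caps the speedup at 1/(1-p) no matter how many cores are added. A quick illustration:

```python
# Amdahl's law: speedup from n cores when only a fraction p of the workload
# parallelizes. The serial remainder (1 - p) caps the gain at 1 / (1 - p).

def amdahl_speedup(p: float, n_cores: int) -> float:
    return 1.0 / ((1.0 - p) + p / n_cores)

for p in (0.50, 0.90, 0.99):
    print(f"parallel fraction {p:.2f}: "
          f"4 cores -> {amdahl_speedup(p, 4):.2f}x, "
          f"80 cores -> {amdahl_speedup(p, 80):.2f}x")
```

With half the code serial, even 80 cores deliver under a 2x speedup – which is why ECU consolidation hinges on how thoroughly the automotive software can be restructured for parallelism, not just on core count.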

The auto industry will go through a transformation, consolidating ECUs into single powerful multi-core processors, similar to what happened in the early 2000s in the networking industry. At that time I had a front-row seat to the networking debate, as I was driving some products at a large semiconductor company. Most network and baseband processor semiconductor suppliers for both the wired and wireless infrastructure businesses moved from single- to dual- to quad-core processors. I remember a day when people were planning to pack as many as 80 cores into a single chip.

There is, however, a huge difference between the software requirements for multi-core processing in the networking and automotive industries.

The elephant in the automotive room is the need to combine mission-critical and non-mission-critical functionality on the same processor, while effectively separating and isolating these functions from each other from a safety and security perspective. This single fundamental requirement becomes the basis for what type of software framework and architecture needs to be used.
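The isolation requirement can be illustrated, very loosely, with ordinary OS processes: a crash in a non-critical task must not take down a critical one. The task names below are invented, and a real vehicle would rely on hypervisor partitions or a microkernel rather than Python subprocesses; this is only a sketch of the freedom-from-interference principle.

```python
import subprocess
import sys

# Two "partitions" as separate OS processes with separate address spaces.

# Non-critical partition: dies with an unhandled exception.
infotainment = subprocess.run(
    [sys.executable, "-c", "raise RuntimeError('media service crashed')"],
    capture_output=True, text=True)

# Critical partition: keeps running, unaffected by the crash next door.
brake_monitor = subprocess.run(
    [sys.executable, "-c", "print('brake monitor OK')"],
    capture_output=True, text=True)

print(brake_monitor.stdout.strip())   # brake monitor OK
print(infotainment.returncode != 0)   # True: the crash was contained
```

The contrast with threads is the point: two threads in one process share memory, so a fault in one can corrupt the other – exactly what the safety case must rule out.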

Multi-core Processing
At a very high level, all multi-core processors pack multiple processing units (cores) into a single physical package – just as the name suggests. But this is where the similarities end. Other architectural factors come into play and determine the application fit, throughput, bandwidth, effective horsepower, and software architectures suitable for an optimal processing environment. Some of the considerations are noted below:

  • Choice and configuration of interconnect buses and shared memory schemes
  • Choice of homogeneous multi-core systems, with identical cores sharing the same instruction set, vs. heterogeneous multi-core systems (some cores with the same instruction set, some with different ones)
  • Heterogeneous multi-core systems that mix different types of processor cores for application-specific use cases (e.g., a mix of MPUs, DSPs, GPUs, etc.)
  • A mix of the above cores with localized memories and predefined high-level functions such as micro-coded engines and vector processors
  • A mix of cores and architectures that allows control- and data-path processing in a single core for communication applications
  • Choice of architectural implementations such as VLIW, vector, or multithreaded processors, fine-grain vs. coarse-grain processors, etc.

The improvement in performance from a multi-core processor can only be realized if the software running on it can take advantage of every cycle the device offers. It also assumes that the interconnect buses and interfaces between the cores, between the cores and the world outside the chip, and between the cores and the memory architecture are properly modeled and designed for the end application, so that no design bottlenecks are introduced.

This situation is analogous to adding multiple streets and multiple lanes in and out of a parking lot. If the electronic gate into that parking lot is too slow to accommodate the extra traffic, you will cause bad congestion, and throughput in and out of the lot will be only as good as the speed of that gate. You may need to open the gate altogether, but have a traffic cop coordinate the flow of traffic in and out of the different entrances and into different parking spots. That is exactly what you also need in the world of software: a traffic cop for the processes running on the given multi-core architecture. That is where a hypervisor comes in – it acts as that traffic cop.

QNX offers a hypervisor and other safety- and mission-critical software to make connected autonomous cars safe, reliable, secure, and trusted.

The next blog will address the hypervisor/traffic cop and describe how it makes the software-defined future more autonomous and safe.



_______________________________________________________________________________
Kaivan Karimi is the SVP of Strategy and Business Development at BlackBerry Technology Solutions (BTS). His responsibilities include operationalizing growth strategies, product marketing and business development, ecosystem enablement, and execution of business priorities. He has been an IoT evangelist since 2010, bringing more than two decades of experience in cellular, connectivity, networking, sensor, and microcontroller semiconductor markets. Kaivan holds graduate degrees in engineering (MSEE) and business (MBA). Prior to joining BlackBerry, he was the VP and General Manager of Atmel's wireless MCU and IoT business unit.


Wednesday, October 19, 2016

Crossroads - INNOTRANS 2016


Terry Staycer
Global Business Development Manager 
BlackBerry






Readers of this blog might be interested in hearing how demands for software safety and security are growing not only in automotive, but in other transportation areas as well – specifically, the railway industry.
 

Last month in Berlin, the 11th InnoTrans, the largest global railway industry event, took place. It was a smashing four-day success in terms of attendance and powerful discussions, and I was honored to attend. Overall, the hot topics revolved around improving mobility, digitization in rail passenger and freight transport, and technology for digital services. Safety and security remain key points of concern.

QNX's hardware partner, MEN Micro, introduced Internet on trains to ease passenger communication and increase convenience. However, that comes with increasing risk of bad actors being able to hack into a rail network. It is critical to ensure that rail systems are as secure as possible and that a train cannot be compromised.
 
Here is a summary of some other key takeaways from what is the leading trade fair for transport technology:

Evident Re-Focusing
There is a re-focusing of development regarding interlocking and signal control among many of the big rail players such as Alstom, Bombardier, GE, and many others.  Application code, hardware, electronics, and sensors are being outsourced. The rail industry is maturing like the automotive market.

SIL-2 All the Way
Customers are pursuing requirements from European and Chinese regulatory commissions, and increasingly those requirements are emerging as SIL-2, not the anticipated SIL-4. With these lower Safety Integrity Levels (SILs), the tolerated probability of system failure increases. Of course customers are still asking for SIL-4, but this is an interesting trend to note.

Security is Critical
Security is a maturing requirement. At the recent Deutsche Bahn Cyber Security Congress, security was a top priority, and it was a hot topic at InnoTrans as well. Some of the questions emerging about security include: If there is a cybersecurity violation, how long does it take to recover? And how does one architect a system for resiliency to cyberattacks?

Fail Safe vs. Availability
Fail safe is good, but high availability is a demand. This topic dovetails with the point above: systems must remain available, which requires both redundancy and fail-safe behavior. QNX is well positioned to address this trend with a microkernel-based operating system architecture that delivers high availability and reliability, making it well suited to mission-critical operations such as rail safety.

China and North America Expansion
China was the most represented country outside of Germany. The Chinese high-speed rail network will span 25,631 km by 2030, and China will boast a total track length of 120,000 km by 2020. In addition, North America will invest over $9.8 billion per year in modernization through 2022, with signaling, locomotives, and rail cars having the highest priority.

It is exciting to watch these trends develop and see which new ones will emerge.   

Already looking forward to Innotrans 2017!