Why forecasting is important

Many CEOs tell me they would feel more comfortable and confident if they could keep better tabs on their financials. They have put their plans in place based on economic and market assumptions made a few months back, but will those assumptions hold as they push through the continual pain barriers of maintaining, and increasing, growth?

Any company seeking growth in 2018/2019 would be wise to include a sensitivity analysis as part of the balance sheet forecast. There are many ways to book actuals, and financial teams may want to spend some time determining the best processes for their companies.

In good times or bad, approaching the future with a robust forecast is vital for all kinds of businesses. That forecast should also take account of wider considerations: politics, economics, global risks and customer behaviour.

Politics
The pollsters failed miserably to predict the outcome of the past two UK general elections, the Brexit referendum and the US presidential election.

It’s tempting to blame the influence of fake news posted on social networks, given that allegations of foreign interference via such media are rarely far from the headlines.
But Ian Goldin, director of the programme on technological and economic change at the University of Oxford’s Martin School, suggests that other forces are stronger.

“The growing extremism we’ve seen is part of a broader set of factors, of which the web is an amplifier, not a cause,” he says.
“Change is accelerating and our social-security safety net is weakening.

People are getting left behind more quickly and insecurity is growing. There is a distrust of authority and expertise. Because house prices, rents and transport costs have increased so much relative to their incomes, people are getting locked out of dynamic cities where unemployment is low, pay is relatively high and citizens are more comfortable with change and immigration.”

So where does that leave those whose job is to gauge public opinion and forecast electoral outcomes accordingly?


Economics
The playwright George Bernard Shaw once said:

“If all economists were laid end to end, they would still not reach a conclusion.”

More than 120 years after he co-founded the London School of Economics, his wry observation has lost little relevance.

Paul Hollingsworth, senior UK economist at Capital Economics, agrees, noting that the profession has “taken a bit of a beating in recent years” for its failure to predict, among other things, the 2007-08 global financial crisis.

“More emphasis needs to be placed on possible ranges of outcomes and the associated probabilities, to enable businesses to plan for the worst but hope for the best,” he says.

Andrew Goodwin, lead UK economist at Oxford Economics, believes that “a premium on adaptability” is the smart way forward. He explains: “We find that the best approach is to combine sophisticated tools with expert insight and to identify alternative scenarios.”

Parikh, meanwhile, points to the value of “stronger intelligence-sharing and collaboration”, especially among SMEs.
Given that the Office for Budget Responsibility has dropped its 2018 GDP growth forecast from 1.6 per cent to 1.4 per cent, calculated circumspection – or what he calls “stress-testing organisations against an array of macroeconomic scenarios” – seems wise advice indeed.
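
To make that advice concrete, here is a minimal sketch of what stress-testing a revenue forecast against a handful of macroeconomic scenarios could look like in Python. Every figure – the scenarios, their probabilities and the revenue_sensitivity parameter – is an illustrative assumption, not a number taken from the article or from any of the forecasters quoted above.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Three illustrative macroeconomic scenarios (all figures are invented for this sketch)
scenarios = {
    "downside": {"gdp_growth": -0.005, "prob": 0.25},
    "baseline": {"gdp_growth":  0.014, "prob": 0.55},
    "upside":   {"gdp_growth":  0.020, "prob": 0.20},
}

base_revenue = 10_000_000      # assumed current annual revenue
revenue_sensitivity = 1.8      # assumed elasticity of revenue to GDP growth

def simulate(n_runs: int = 10_000) -> np.ndarray:
    """Draw a scenario per run, add an idiosyncratic shock, and return simulated revenues."""
    names = list(scenarios)
    probs = [scenarios[s]["prob"] for s in names]
    draws = rng.choice(names, size=n_runs, p=probs)
    growth = np.array([scenarios[d]["gdp_growth"] for d in draws])
    shock = rng.normal(0.0, 0.01, size=n_runs)   # random demand shock per run
    return base_revenue * (1 + revenue_sensitivity * (growth + shock))

revenues = simulate()
print(f"Median revenue:               {np.median(revenues):,.0f}")
print(f"5th percentile (stress case): {np.percentile(revenues, 5):,.0f}")
```

Reading off the 5th percentile rather than a single point forecast is exactly the “plan for the worst but hope for the best” framing Hollingsworth describes.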

Global risks
“In many respects it’s becoming easier to assess business-related risk owing to the increasing accessibility of open-source information and intelligence,” says Phil Cable, co-founder and CEO of risk management firm Maritime Asset Security and Training.

“Global competition has forced businesses to spread their wings and trade in places where they wouldn’t otherwise go. But assessing personal risks and employees’ safety, security and health concerns in places where western standards are limited is still challenging.”

The Ipsos Mori Global Business Resilience Trends Watch 2018 survey, conducted in partnership with International SOS, revealed that 42 per cent of organisations had altered the travel arrangements of their employees in 2017 because of risk ratings pertaining to security threats and natural disasters.

“We have moved from a world in which around 200 million people were connected in the late 1980s to one in which more than six billion people are connected. The silos we used to work in no longer apply. We can sell to places anywhere in the world, but there’s a downside – a pandemic can now cause a financial crisis, for instance. Hurricane Sandy, had it been bigger, could have led to a global crash.”

Customer behaviour
Forecasting how the public might spend its hard-earned cash is a far better-informed exercise than it ever has been.
So says Steve King, co-founder and CEO of Black Swan, a firm that predicts consumer trends using what he calls “the world’s most advanced database of consumer thought and opinion” – aka the internet.

“Never before have we lived in an age when so many people have shared so much information about themselves, or when this knowledge has been so readily accessible,” King says.

“It’s going to be incredibly interesting to see how the development of disruptive connected technologies such as the internet of things will change our behaviour in unexpected ways.”

To fully exploit the sheer volume of customer-related information to be found online, real-time monitoring and instant responses are imperative, he adds.

“Micro-trends are effectively created and destroyed almost overnight now. Brands must start moving with the times and away from qualitative future-gazing. They need to adopt new platforms that continually analyse social trends and offer quantifiable, robust predictions powered by artificial intelligence and machine learning.”

A final thought
Many companies do not understand the strategic importance of forecasting.

Having the right resources available at the right time is essential for efficient functioning.
In today’s tough business environment, where businesses are trying to cut costs, every penny counts.
Forecasting is one way to save money: only by forecasting can companies estimate future demand and manage their resources accordingly, and poor forecasting can lead to significant losses in businesses both small and large.

All large companies use forecasting when formulating their strategy, because without it no well-founded decisions can be made. It is true that no one can predict the future accurately, but forecasting can give a general idea of the future on which present decisions can be based. Forecasting is therefore an important strategic tool for all businesses.

Paul Polman once said:
“Practically, systemic thinking can be used to identify problems, analyze their boundaries, design strategies and policy interventions, forecast and measure their expected impacts, implement them, and monitor and evaluate their successes and failures.”

Guest-blog: Neil Cattermull – ‘A Guide to Machine Learning’

Neil Cattermull

Because of new computing technologies, machine learning today is not like machine learning of the past. It was born from pattern recognition and the theory that computers can learn without being programmed to perform specific tasks; researchers interested in Artificial Intelligence wanted to see if computers could learn from data.

The iterative aspect of machine learning is important because as models are exposed to new data, they are able to adapt independently. They learn from previous computations to produce reliable, repeatable decisions and results. It’s a science that’s not new, but one that has gained fresh momentum.

While many machine learning algorithms have been around for a long time, the ability to automatically apply complex mathematical calculations to big data – over and over, faster and faster – is a recent development. Here are a few widely publicized examples of machine learning applications you may be familiar with:

• The heavily hyped, self-driving Google car? The essence of machine learning.
• Online recommendation offers such as those from Amazon and Netflix? Machine learning applications for everyday life.
• Knowing what customers are saying about you on Twitter? Machine learning combined with linguistic rule creation.
• Fraud detection? One of the more obvious, important uses in our world today.

Resurging interest in machine learning is due to the same factors that have made data mining and Bayesian analysis more popular than ever: growing volumes and varieties of available data, computational processing that is cheaper and more powerful, and affordable data storage.

All of these things mean it’s possible to quickly and automatically produce models that can analyze bigger, more complex data and deliver faster, more accurate results – even on a very large scale. And by building precise models, an organization has a better chance of identifying profitable opportunities – or avoiding unknown risks.
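
As a hedged illustration of that “learn from past data, then make repeatable decisions” loop, the sketch below trains a small classifier on synthetic transaction data using scikit-learn. The dataset, the features and the fraud-detection framing are all invented for the example; this is not code from any of the companies or applications mentioned above.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic "transactions": 1,000 rows, 10 numeric features, ~5% labelled as fraud
X, y = make_classification(n_samples=1000, n_features=10,
                           weights=[0.95, 0.05], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)             # learn patterns from historical data

predictions = model.predict(X_test)     # apply them, repeatably, to unseen data
print(f"Accuracy on held-out data: {accuracy_score(y_test, predictions):.2f}")
```

The same fit/predict pattern scales from this toy example up to the larger, more complex datasets described above.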

In the second of a two-part blog series (the first being “Digital Transformation”), we welcome back Neil Cattermull, a public speaker and commercial director based in London, United Kingdom, who writes widely about technology and entrepreneurship and is considered a global industry influencer and authority on the tech scene.

Neil has travelled around the world assisting firms of all sizes with their business models and is ranked as a global business influencer and technical analyst.

Neil has held directorships within the technology divisions of financial services firms such as Merrill Lynch, WestLB and Thomson Financial, and has created many small to mid-size organisations.

Neil is going to discuss ‘A guide to Machine Learning’.

Thank you, Geoff.
Machine learning is one of the most innovative and interesting fields of modern science, and something you probably associate with the likes of Watson, Deep Blue and even the infamous Netflix algorithm.

However, as sparkly as it is, machine learning isn’t exactly something totally new. In fact, the concept and science of machine learning has been around for much longer than you think.

The beginnings of machine learning
Thomas Bayes is considered to be the father of machine learning, yet his theorem was pretty much left alone until the rockin’ fifties when, in 1950, famed scientist Alan Turing created and developed his imaginatively named ‘Alan Turing’s Learning Machine’.
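
For reference, the theorem in question describes how to update the probability of a hypothesis H once evidence E has been observed:

P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}

This idea of revising beliefs as new data arrives is the same one that underpins much of modern machine learning.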

The machine itself was capable of putting into practice what Thomas Bayes had conceptualised 187 years earlier. This was a huge breakthrough for the field, and along with the acceleration of computer development, the next few decades saw a gigantic rise in machine learning techniques such as artificial neural networks and explanation-based learning.
These formed the basis of modern systems managed by artificial intelligence, with the latter arguably the most integral to the development of systems-management technology.

Explanation-based learning was primarily developed by Gerald DeJong at the University of Illinois. He essentially built upon previous methods to develop a new kind of algorithm: enter the explanation-based learning algorithm.

Yes, the explanation-based learning algorithm was fairly standard in that it created new rules based on what had happened before. However, what sets it apart as a breakthrough is that DeJong had managed to create something that could independently disregard older rules once they had become unnecessary.
Explanation-based learning was one of the key technologies behind chess-playing AIs such as IBM’s Deep Blue.

A cold AI Winter
However, there was a period during the 1970s when funding was disastrously reduced because people had started thinking that machine learning wasn’t living up to its original billing.
This was compounded when Sir James Lighthill released his independent report which stated that the grandiose expectations of what artificial intelligence and machine learning could achieve would never be fulfilled.

This report led to many projects being defunded or closed down, which was incredibly unfortunate timing, as the UK was considered a market leader in machine learning. This dark period became known as the ‘AI winter’ and, bar a momentary slip in the early 1990s, was the only real time that the possibilities of machine learning were ever seriously discounted by the scientific community.

Who is pushing the technology forward now?
Machine learning has now reached a level where companies such as DataKinetics have the capability to transform legacy systems into business-driven analytics.

DataKinetics are at the forefront of their field and have been entrusted by many blue-chip companies, such as Nissan and Prudential, to streamline and optimize complex technology environments. With today’s advancements in technology, IT professionals are now capable of achieving far more thanks to new innovations in machine learning.
However, this is just the beginning – if funding and interest into machine learning and AI remains consistent, there’s no telling what can be achieved.

Machine learning algorithms can now predict future outcomes, giving us – the humans – time to react accordingly.

In essence, the main idea behind machine learning is that a computer or a system takes a set of previously created data, applies a set of learned rules to it and provides you with an output in a way that is far more efficient.
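
As a toy sketch of that data-in, output-out loop, the example below fits a simple model to twelve months of invented sales figures and then applies what it has learned to the following quarter. The numbers and the choice of a linear model are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

months = np.arange(1, 13).reshape(-1, 1)            # months 1-12 of historical data
sales = np.array([110, 115, 121, 128, 133, 140,     # observed monthly sales (invented)
                  147, 151, 158, 165, 171, 178])

model = LinearRegression().fit(months, sales)       # learn the trend from past data
next_quarter = np.array([[13], [14], [15]])         # unseen future months
print(model.predict(next_quarter))                  # predicted future outcomes
```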

In much the same way, there is a clear continuity between the innovators and forefathers of machine learning and the companies and groups of people doing it today.

That’s why companies such as DataKinetics are proud to be associated with such a rich and storied period of human endeavour.
Innovators are just as important as pioneers: without innovation we have static evolution that does not progress us further, and the tech space is in a state of near-constant change.

DataKinetics are innovators within technology and have had the foresight to predict the evolution of mainframe, machine learning and analytics, with a tech roadmap spanning more than 30 years!

You can contact Neil Cattermull:
– LinkedIn: linkedin.com/in/neilcattermull
– Twitter: @NeilCattermull
– email: Neil.Cattermull@gmail.com

Guest-blog: Neil Cattermull – Digital Transformation and ‘Open source on steroids’

Neil Cattermull


The words ‘Digital Transformation’ seem to crop up every day in internal discussions at executive board and C-suite level. But exactly why are these discussions important, and why should they be a priority?

Firstly, what is Digital Transformation and why is it so important?




Let’s start from the beginning. Heraclitus once said:
“The only thing that is constant is change” – and this is as true and relevant today as ever.

With major moves forward in technology and access to digital media in the past 10 years, people now view technology in a completely different way and also learn in a different way.
This has been a huge factor in creating a need for companies to evolve and stay relevant, transforming the way they run their business and train their staff.

With the general concentration span of millennials being much shorter than that of their predecessors, businesses must change the way they interact with millennial employees and customers.

If we look at this from an internal perspective too, we see that everything from employee training to onboarding and productivity can be improved through digital transformation done in the right way.
It is important to remember, though, that digital transformation will generate some pushback and resistance. This is very normal, and it is another reason why it is important to implement it well.

Effectively, Digital Transformation is an ongoing effort to rewire all operations for the ever-evolving digital world, by adopting the latest technologies in order to improve processes, strategies, and the bottom line.

Digital transformation became a term decades ago, and at that time it largely meant digitising. But today, a company needs to leverage digital tools to be more competitive, not just more digital.

Going forward, companies will need to harness machine learning (ML), artificial intelligence (AI) and the Internet of things (IoT) to be pre-emptive in their business strategies, rather than reactive or presumptive.

And after that? We can only speculate. Technology is advancing at a faster pace than we can adapt to it. What is clear is that digital maturity is a moving target, which makes digital transformation ongoing.

Today I have the pleasure of introducing another guest blogger, Neil Cattermull. Neil is a public speaker and commercial director based in London, United Kingdom, a public figure who writes about technology and entrepreneurship, and he is considered a global industry influencer and authority on the tech scene.

Neil has travelled around the world assisting firms of all sizes with their business models and is ranked as a global business influencer and technical analyst. He has held directorships within the technology divisions of financial services firms such as Merrill Lynch, WestLB and Thomson Financial, and has created many small to mid-size organisations.

Neil is going to discuss ‘Open Source on Steroids’.

The words “digital transformation” are on the lips of every person in technology and tech media, as well as many business leaders – from company CIOs and CTOs to business-line managers to writers in news publications and tech blogs.

At its core, a digital transformation is the enablement of technologies and workplaces tuned to today’s digital economy. The beating heart of this digital economy is the API, now being joined by emerging technologies such as the Internet of Things (IoT) and FinTech technologies such as blockchain.

Today the transformation of processes, IT services, database schemas and storage are proceeding at exponential rates with Cloud, AI and Big Data currently taking center stage as new ways of working in the enterprise.

The glue to the majority of developments in the technology world is the adoption, proliferation and acceptance of Open Source technologies. Community developments such as Hadoop, Apache Spark, MongoDB, Ubuntu and the Hyperledger project are some of the names that freely fall from any Open Source discussion.

The question is: where do you run these workloads? How do you run them? And what form should they take?

Most major companies will have mainframe systems at the core of their IT systems, so the question is really whether to run new workloads there, or on other platforms?

Any building architect will tell you that before tearing down the walls of an old house or doing any significant structural changes, you should always consult the original architectural plans. In the same way, any systems architect would look very closely at what a mainframe system is doing now before considering running workloads elsewhere.

However, it is imperative to understand the difference between mainframe and midrange server technology at a very high level:

• Mainframe systems are designed to scale vertically rather than horizontally
• Input/output in mainframe systems is designed to move processing away from the central processors, with very fast I/O built into the core of the system, even at the hardware layer
• Centralised architecture is a key feature, allowing mainframe systems to manage huge workloads extremely efficiently – catering for 100% utilisation without any degradation of performance
• Resilience is built into every key component of a mainframe, with redundancy at the core
• At a transactional level, no other system comes anywhere near the volume of data processing that a mainframe can handle.

An argument against the mainframe could be to decouple software systems onto commodity hardware or cloud systems; but this tends to create server and cost sprawl, particularly if an important goal is to mirror the mainframe’s performance, security, transaction throughput capacity, reliability, maintainability and flexibility.

But as we move further into the world of IoT, with databases and Big Data systems ingesting vast amounts of social media and transactional data, how are we scoping the growth and security of these systems?
We are not, if we simply keep adding to existing IT infrastructures – we need to be able to scale access and throughput to manage, interrogate and optimise the hottest commodity we have: data!

Data is becoming a currency in its own right, but we need to secure this new currency in the way we secure traditional monetary systems today.
And perhaps the best way to leverage this valuable asset is via APIs that allow enterprises to take advantage of the mainframe investments already made – enter the LinuxONE, an Open Source-ready mainframe that assists with creating a familiar Open Source tooling stack on steroids!
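
As a purely hypothetical sketch of that API-first approach, the snippet below shows how a new application might call an existing balance-enquiry transaction through a REST facade placed in front of the mainframe. The gateway URL, path and field names are invented for illustration; they are not a real LinuxONE or z/OS interface.

```python
import requests

# Hypothetical API gateway sitting in front of the mainframe transaction
API_BASE = "https://api.example-bank.internal"

def get_account_balance(account_id: str) -> float:
    """Call the (hypothetical) balance-enquiry transaction via its REST facade."""
    response = requests.get(
        f"{API_BASE}/v1/accounts/{account_id}/balance",
        headers={"Authorization": "Bearer <token>"},   # placeholder credential
        timeout=5,
    )
    response.raise_for_status()
    return response.json()["balance"]

# Example usage (would only work inside the hypothetical environment):
# print(get_account_balance("12345678"))
```

The point is not the specific endpoint but the pattern: the system of record stays where it is, and the investment already made in it is surfaced to new digital channels through a well-defined API.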

The argument here is that a digital transformation is more than just empty words and data thrown onto cloud servers; it is a state of mind, and an architecture that should encompass current and future systems in support of overall business goals.

At the heart of this goal is the end-user consumer, something that every system architect should be very mindful of; however, downtime and security are quite often underestimated when creating the initial framework for key infrastructure projects.

These key elements must be baked into every project and sit at the very core of future technology initiatives – something that the Open Source-ready LinuxONE infrastructure delivers extremely well.

You can contact Neil Cattermull:
– LinkedIn: linkedin.com/in/neilcattermull
– Twitter: @NeilCattermull
– email: Neil.Cattermull@gmail.com