InterConnect 2017 Conversations: Jeff Spicer talks with Dez Blanchfield


Jeff Spicer is the gentleman on the right.

 

The following is a transcript of a “fireside chat” video with Jeff Spicer recorded at IBM InterConnect 2017 in Las Vegas (USA).

Click this link to watch the video => http://www.ibmbigdatahub.com

Jeff:                          

Hi, this is Jeff Spicer. I am here in Las Vegas at IBM InterConnect, and it’s the last day of InterConnect. This has been a really amazing conference. As you know, this is our cloud computing and infrastructure conference. We had some big announcements this week. Over 15,000 people have attended, we’ve had thousands of sessions, and a couple of big announcements have come out of the week. And sitting with me today is Dez Blanchfield.

And Dez is a consultant with his own company in Australia. He has over 25 years of experience working with many different industries and many technologies. You’ve got experience in data, big data, infrastructure, cloud computing, you name it; I think there isn’t a technology that you haven’t touched.

Dez:                         

That ages me very quickly of course.

Jeff:                          

You started when you were in junior high school, and actually we’ll get back to that because you did start in junior high school, which I think is really fascinating. But I wanted to start by asking you about what happened before the conference even began. So IBM hosted the Open Tech Summit this year …

Dez:                         

That’s right.

Jeff:                          

Before InterConnect. And as you know, IBM has been really embracing the open source community over the past few years, in fact, before that. And IBM sees real value in open source and being part of that community, fostering conversations and relationships with that community. So give me your thoughts on how it was to start a vendor conference with the Open Tech Summit.

Dez:                         

I think one of the biggest things that I’ve noticed in this entire recent pivot of reinvention by IBM is this whole adoption of the communities around open source. And day zero on Sunday, the afternoon of the Open Tech Summit, was I think a really good blend of the meetup slash birds-of-a-feather crowd, the usual hackers and open source coders and code committers and designers, and the more proprietary world that IBM sort of historically came from, but has now pivoted around. It was a great turnout, a fantastic venue, an amazing collection of speakers. Proprietary code, open source code, all sorts of walks of life. We even had IBMers throwing t-shirts at us from the stage.

But it was a really great start to the overall conference because it broke the ice with us open source folk, it gave IBM’s team a chance to get in and mingle and get to know each other, and it was a nice mix of social and business. And we actually got to meet a whole bunch of people we had never met in person, because we’d only known them through Twitter or other channels.

So I think it was a really good toe dip from a very large company to get into this open source community and then engage with that community and make it a very relaxed journey to actually get to day one, day two, day three around all the core messages. So I loved it, I hope to come back again.

Jeff:                          

Yeah I’m sorry I didn’t get hit in the head by a t-shirt but there’s always future events.

Dez:                         

I missed it by “that” much.

Jeff: 

There’s always a future time. So one of the themes of the conference this year has, interestingly, been about data itself. And I say that’s interesting because this is a cloud and infrastructure conference; that’s largely what it’s about. But Arvind Krishna, our SVP of hybrid cloud, and David Kenny, our SVP for public cloud and Watson, both addressed the idea that data should be the centre of your cloud, that data brings with it differentiated advantage for your business.

Dez: 

Absolutely.

Jeff:   

So when you’re thinking about infrastructure, you also need to be thinking about data and how you make it widely available. And you’ve been working with data, big data and infrastructure for years as a consultant. When you sit down and talk with companies about the value of data, first, do you have to convince them of that value, and second, what do you tell them? How do you help them get that differentiated benefit from their data?

Dez:                         

I think there are two key things that I would take out of that, if that’s okay. The primary thing, and it has been reinforced day after day through this whole event, is that data is now the centre of our universe. And I coined the phrase a while back, I don’t pretend that I invented it, but I’ve been using the phrase that data has gravity, in the same way that dust particles in the universe came together, new planets were formed and all of a sudden there was gravity.

When you get enough data together it does have gravity. If you’ve got a large enough community base, if you’ve got a large enough data set, it becomes a thing in its own right. So I think companies have always had data, they’ve always had these assets, they’ve just never truly understood what was in them. And it might sit in an HR, finance, engineering or design silo.

But often they haven’t shared the data, or they haven’t aggregated it or put it into a data lake. And I think if anything, in the last few days, the 15 to 20 thousand people here are going to leave with one very clear understanding, and that is that more than ever, we now know, thanks to these great sessions you’ve been running, that data is the centre of our universe and we need to start thinking about data in a very different way.

It’s kind of like the new oil; it’s a really critical asset. We’ve often got too much of it, and we hoard some of it. So we need to figure out how we go from the traditional database environments, the RDBMS platforms like DB2 that we’ve been familiar with for decades, because data has been around for a long time, to what we’re doing now in data lakes, which, if they’re done wrong, become a data swamp, which is unfortunate, and you might find alligators in them. The traditional database environment is not going to go anywhere, by the way, in my view.

I think a lot of people think that big data and Linux are going to kill the database; that is definitely not the case. Look at all the big banks, all the big airlines: they’re all running big databases, and invariably on mainframes, curiously enough. But there’s a path where data goes from a tightly structured format, in a sense, to slightly less structured in the data lake format, and then, many times, into the cloud.

And the second part then flows on naturally: when we start putting data into some of the environments you provide, like the Watson data platform, and particularly when we start consuming services like the Watson machine learning service and the cognitive tools. When we can take our data sets and put them into a data lake format, and we can trust them to be hosted in the cloud, we can then use the governance tools you’ve made available to allow that data to burst into these new environments, so I don’t have to move all the data, just the bits I need.

And one of the biggest challenges is just explaining it in plain English, whether it’s in the boardroom at the top end or getting down to the techies and going toe to toe. They often have very solid views of what they think data is, maybe columns and fields and rows and tables and modelled databases, but often now, when we think about data, it’s social, it’s open data sources from government, it’s IoT sensor information.

If I’m thinking about how to get an ambulance from one side of the city to the other at a particular time of day, I want to take everything: I want to know traffic, I want to know social, I want to know weather. If I know that someone’s tweeted that there’s an accident on the route, I should be able to pull that data and feed it back to that ambulance driver and tell them to avoid that accident, or the system should tell me. So when I talk about this topic, whether it’s with federal government, large enterprise, medium enterprise or startups, I try to get them to understand that they actually have this asset that they have probably overlooked for too long, and the takeaway from this week in particular around data has helped clarify that.

And I’ve grabbed a whole bunch of vocabulary and language from this week that I’m going to reuse about the value of data, getting insights from data, and the tools we need to deliver and extract those insights. It’s been a very valuable week in that sense, and in many other ways, but that one in particular.

Jeff:                          

Yeah, there’s been a lot going on this week. You mentioned a couple of technologies, Watson data platform and a couple of others, that have a sort of cognitive technology built in, in some cases machine learning, in some cases more advanced cognitive capabilities. And I’m curious about … you also mentioned the accessibility of data and insights. And I think the two are starting to become linked, in that to manage these big volumes of data that you’re talking about, to really begin to draw insights from them, and then to make those insights available and accessible, you need some sort of a system that really understands how to help you manage that data and how to help you draw insights out of it.

And I think that’s where the machine learning comes in, in terms of algorithm selection through CADS, the cognitive assistant for data that we announced a couple of months ago, and then Watson data platform, which we announced last fall. These are technologies now that begin to not just democratise data, but democratise insights, and begin to assist with the entire data process. And I’m wondering, again, how you see this and how you’re presenting these capabilities to your customers in bringing them insights faster and in managing all of these data sets in much more feasible ways.

Dez:                         

So I had a conversation yesterday with somebody on the IBM team whose name escapes me. But something came out of that conversation where we agreed that a lot of the technologies we’re talking about, if you’re a practitioner, whether you’re in the data wrangling business or data analysis, or you’re doing high end statistics, or you’re a developer … One of the big shifts we’ve seen with what you’ve been announcing over the last couple of years, and particularly this week, is that we now have access to the tools to do the things we’ve been dreaming of for years, for decades.

To be able to take a data set and do sentiment analysis on it was a non-trivial exercise previously. Now it’s just a service we can consume, right? And I don’t have to move all the data there; I only have to make some of the data available, and I can consume it at an API level. And as I mentioned before, one of the projects we saw was where a couple of young fellows had built a tool for financial services, and they showed me some of the tools they used, like the Watson data platform, to move their data sets somewhere they could securely work on and collaborate.
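[To make the “consume it at an API level” idea concrete, here is a minimal sketch in Python of calling a hosted sentiment analysis service over HTTP. The endpoint, key and response shape are hypothetical stand-ins, not a specific IBM API.]

# Minimal sketch: consuming sentiment analysis as a hosted service rather
# than building it locally. The endpoint and response shape are hypothetical
# stand-ins for whichever cloud NLP service you subscribe to.
import requests

API_URL = "https://example.com/v1/sentiment"   # hypothetical endpoint
API_KEY = "YOUR_API_KEY"                       # supplied by the service

def score_sentiment(text: str) -> dict:
    """Send a single document to the service and return its sentiment scores."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"text": text},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()   # e.g. {"label": "positive", "score": 0.92}

print(score_sentiment("The new platform has been a pleasure to work with."))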

They’d used the Watson machine learning engine to then start to drive some insights from that data set. It turned out when I asked them what their backgrounds were and what their pedigrees were, neither were technical and neither were programmers. And they’d only really taught themselves enough Python to talk to the APIs and then feed it back into Watson.

And you mentioned CADS, the Cognitive Assistant for Data; I know I’m going to make use of it as well … It allows you to take structured and unstructured data and actually ask the tool to describe it back to you. So in other words, I don’t really know what I’m looking at, but you’ve got all these pre-determined models. You know what a schema looks like, you know a data type, you know the difference between, say, a tweet and an email. You know the difference between a firewall log and a PDF file.

And it was interesting because it encapsulated most of what you were just talking about there in an actual live demo on the floor. So I probably spent about an hour and a half with them at one point and annoyed the hell out of them.

But from a day to day point of view, as a practitioner, what I’m looking forward to now is going home to Sydney in Australia and getting started on some of the tools you’ve announced this week. Some of the things I’ve been doing before took a long time because I had to codify them myself. I can now just consume them as a service.

If I want to apply machine learning, if I want to find some form of grouping and do regression analysis on a data set and find outliers and figure out why they’re outliers, I don’t have to codify that now. I can just consume it. In fact, at World of Watson last year, I sort of threw this term together where I said, I think what you’ve done more than anything, in my view, is you’ve taken the data science challenge and the cognitive computing challenge and machine learning, and you’ve kind of made it as simple as webmail.
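[As a rough illustration of the regression-and-outlier analysis being described, here is a minimal local sketch using scikit-learn on synthetic data. It is only a stand-in; the point above is that a hosted machine learning service now does this kind of work behind an API call instead of hand-written code.]

# Minimal local sketch of regression plus outlier flagging, using
# scikit-learn and synthetic data purely for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=(200, 1))
y = 3.0 * x.ravel() + rng.normal(0, 1.0, size=200)
y[:5] += 15  # inject a few obvious outliers

model = LinearRegression().fit(x, y)
residuals = y - model.predict(x)

# Flag points whose residual is more than three standard deviations out.
threshold = 3 * residuals.std()
outliers = np.flatnonzero(np.abs(residuals) > threshold)
print(f"slope={model.coef_[0]:.2f}, flagged {outliers.size} outliers at rows {outliers}")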

And it might seem a little disingenuous of me to simplify it that way. But think about what happened with email, for example: you used to need routers and servers and storage and so forth before you could get a mail account. Now you just go and sign up on a web page. And I think data science is experiencing the same thing with some of the cloud tools [inaudible 00:12:09]; that stuff’s just there.

Correct me if I’m wrong, but it’s like 25 bucks a month or something and I can sign up and try it out, or I think you can even get a 30 day trial. I think there’s a significant shift in accessibility and experience, and it’s also lowered the barrier to entry. Once upon a time it was really difficult to build these environments and get to the point where you could use notebooks and Python and RStudio and securely host data. Well, that’s no longer the case; you’ve removed all those barriers. So now the challenge for us is: have we got the right data? Is it in the right state, and what do we actually want to do with it?

So you kind of come back to that piece of machine learning where we can ask plain English questions, NLP, natural language processing, of the data set. We can actually ask ourselves that question: what is the objective we’re looking for, and then how do we ask the platform to do it for us?

Jeff:                          

And some of the tools themselves, too, which might have been difficult to download and use, difficult to setup. IBM is trying to streamline that process now with the new download and go capabilities.

Dez:                         

I love download and go. I think the download and go thing, again, has got rid of all that challenge of actually getting access to the tools and the ecosystem. The idea of spending eight months in a government agency installing Linux, getting Python installed, getting RStudio or whatever it might be, getting the Jupyter notebooks going, and only then getting to the point where you ask, can I actually get a username and a password, and can I copy my files in, and … It’s just too hard, right?

And eventually you just become exhausted, you don’t want to do it anymore and you’re looking for other options. The download and go thing is a significant game changer for all of those tools and platforms; now I can point and click. I think it took less than three minutes in a live demo yesterday. I just saw him click download, copy, click install, done, done, done, finished. And he opened it up and started playing with it.

And I realised that’s not a canned demo; he just did that live. So I can do the same thing when I get home: I can log in, click a download and get access to the preview. And when it’s released in GA, it’ll be a three minute click and go and I’ve got this whole ecosystem.

And then you roll that through all the other things you’re delivering in what is currently the cloud suite, which gives me the ability to play with data and all those other tools like machine learning and cognitive. It’s a huge game changer, because I can do it on my desktop now. I can play with it, I can learn with it, I can get experience, and when I’ve got experience I can start to think about how I’d deploy it at large scale, and I can do it at scale in the cloud and only pay for what I use.

Jeff:                          

One final question, I just want to take a step back. And we’ve been talking about technologies and new capabilities and what those bring to the business, but I wanted to ask about industries themselves and who’s being impacted by this first. And by this, I mean the shift to the cloud, some of the cognitive technologies and then beginning to unlock what’s in your data. You work across a wide variety of industries, right?

You’ve worked in the public sector and in government, you’ve worked in telecommunications, education, travel and transportation, retail … Is there an industry or a set of industries that leads the pack in terms of applying some of these new technologies, understanding that there is value in their data and that it will be a differentiator for their business, and then applying these new capabilities, be they cognitive, be they machine learning, or even some of the capabilities you were talking about in terms of simply managing large volumes of data inside your firewall and then merging that with external data to get insights that you might not otherwise get? Are there industries that you think are really leading the pack?

Dez:                         

Surprisingly, there are three that I did not expect. I think across the board, all organisations will, at some point, come to the realisation that they can’t disrupt themselves internally. But there were three that really surprised me. Federal government, number one.

I’ve recently been part of a project where we took PDF documents that were printed out and signed and sealed and put back in for a particular purpose, and there are millions of these things annually. And then people view those PDFs on a screen and type them into a system. It’s just the craziest thing I’ve ever seen, but that’s just where they were at. We took those PDFs, tagged particular sections, took them from a semi structured to a very structured format, trained an algorithm on that, ran it through a streaming platform and then put it back into a structured form. I did not expect government to adopt that as quickly as they did.

There were a number of people who came into these organisations that described very clearly the benefits and they were open minded to it. I thought they would be as closed minded as you could imagine. I’ve always had this view that government didn’t like change, change was risk, risk equals cost or whatever it might be.

And yet federal government, particularly in Australia, but I know it’s a global thing, south east Asia, China, Singapore and Malaysia, I’ve seen it all over the place, but Australia in particular in my back yard. Federal government has adopted some of these things, we’re talking about so quickly that it was almost breathtaking for me. I’ve heard people asking to help them with rewriting cloud strategies across federal government agencies, writing policies for them to catch up with what they’ve already done, figure out how to develop a governance framework to catch the entire thing in case something falls through, and then run brown bag sessions, demos, proof of concepts … They’re putting machine learning against PDF documents by the millions. Did not expect that.

Banking I probably expected, but not at the pace I’ve seen. I think Australian banks have been leaders for a long time. They certainly make the most money out of me of the banks in the world, for some reason. But banking and finance, particularly wealth management, has taken up the big data and cloud analytics thing across the business. I can’t name any, but probably the top three banks in Australia have all had programmes running over the last three to five years where they’ve gone from traditional data processing to big data and streaming analytics, to now thinking about machine learning and cognitive, and they’re running multiple streams. They’re prepared to fail and fail fast.

And I know for a fact that you’re already working with most of them, and the ones you’re not working with are sitting up and paying attention and figuring out why they aren’t working with you. So I think that’s a great new change. The other one that I did not expect, the third one, was logistics and transport. And I know you’re doing a really cool project with tracking containers on ships and so forth.

Logistics and transport got the value of the data they had; they just didn’t know how to use it. And so the whole track and trace thing … kind of the Uber-fication of logistics and transport. When you think of what Uber did, they built a platform to move people around with taxis. I have a view that that was just a distraction; they want to take over the logistics market.

But there are a number of very big operators in south east Asia that I work with who understood the value proposition of tracking everything that was going through the business, collecting data around it, doing analysis of that data and then making that data available to partners. And even, I think, making some of the data available to researchers, so that uni students could look and say, well, if I take this [inaudible 00:19:04] shipping with that anonymised data set, or a tokenised data set in some way, I can come back and show you how to improve something in your business.

Those three are probably the last three I would have expected, and yet they’re the ones disrupting themselves as fast as they can afford to.

Jeff:                          

Interesting. Yeah, you’re right, I would not have expected government. I would have expected something that’s consumer facing, something retail …

Dez:                         

I was going to say the same. Retail. That was a big surprise, and I’m very excited, because I’m a taxpayer, and if I can get them to save any money, that’s my money they’re saving, hopefully to go to my children’s education.

Jeff:                          

Well Dez, thank you so much for joining us, I really appreciate it.

Dez:                         

Thanks for having me.

Jeff:                          

Dez Blanchfield of Gara Guru in Australia. Really appreciate it, thank you so much.

Dez:                         

Thanks so much for having me here, it’s been a real privilege. I met some great people, I’ve loved every keynote session and I can’t wait to do it again soon.

Jeff:                          

Great, thanks Dez.

Dez:                         

Thank you.

Dez Blanchfield

Dez Blanchfield is a strategic leader in business & digital transformation, with three decades of global experience in Business and the Information Technology & Telecommunications industry, developing strategy and implementing business initiatives. He works with key industry sectors such as Federal & State Government, Defence, Banking & Finance, Airports & Aviation, Health, Transport, Telecommunications, Energy and Utilities, Mobile Digital Media and Advertising, and Cyber Security. His focus is driving outcomes for organisations by leveraging Digital Disruption, Digital Transformation, Cloud Computing, Big Data & Analytics, Machine Intelligence, Internet of Things, DevOps Integration, Automation & Orchestration, App Containerisation & Micro Services, Webscale Infrastructure, and High Performance Computing. Be sure to follow Dez on LinkedIn ( http://linkedin.com/in/dezblanchfield ) and Twitter ( http://twitter.com/dez_blanchfield ).
