5 THINK Conference Dreams, Themes, and Details

Dave Beulke

As I said before, the IBM THINK conference was an interesting event. With several different presentation venues, picking sessions to attend made planning your day, and the travel between sessions and venues, a challenge. Also, since most sessions didn't have their presentations available for viewing beforehand or within the THINK application, it was difficult to tell whether a session would be technical or a sales pitch.

One task I highly recommend before the THINK conference is researching the speakers of your desired sessions. This pays off tremendously because it helps you develop a plan to attend and learn from the technical content available at the conference. Understanding the technical depth of the presentation material and the speaker's perspective is vital for picking the right session in a particular timeslot. Also, since the 40-minute sessions were dispersed across several venues, getting to a second-choice session in another building was almost impossible.

These are my main impressions and takeaways from the various keynotes, spotlight featured sessions, general attendee conversations, and the sessions I attended.

  1. Quantum computing is coming. Going to the exhibit hall, seeing the quantum computer, and hearing from quantum researchers from IBM, Exxon, Mercedes Benz, and J.P. Morgan was fascinating. The current quantum computing power, and the exponential growth of its capabilities, truly will enhance and change the types of questions, analysis, and analytics that can be performed against hard issues and bigger data sets. Quantum dreams are becoming reality with the first working quantum computer, and its enhancements are being mapped out by IBM and its business partners. The long-standing promises of quantum computing power are coming true, and the consensus prediction is that quantum supercomputing will take the lead in as little as the next five years.

  2. AI and ML are the next game-changing technologies. There were many presentations on successful AI and ML projects, frameworks, AI and ML coding activities, and the programming of complex formulas. Many companies presented how Watson Anywhere (previously Watson Studio) was helping them manage and access local, hybrid, and multi-cloud configurations such as AWS, Azure, Google, and IBM Cloud within their AI and/or ML projects. Some presentations detailed how Watson Anywhere can reference your favorite Java, C, R, or other code editor and interface with the related integrated open-source compilers. Jupyter Notebooks and a wide variety of open-source code and services can be integrated and referenced within Watson Anywhere.

    The AI and ML projects are having dramatic bottom-line profitability impacts for all types of companies. According to a presentation by Forrester Research's Boris Evelson, companies that develop these AI and ML “Systems of Insight” are 2.5 times more likely to grow revenue at 20 percent or more and 240 percent more likely to create sustainable competitive advantage. Testimonials to this type of impact were given during IBM CEO Ginni Rometty’s keynote in conversations with GEICO Insurance, Kaiser Permanente Healthcare, and Hyundai Credit Card.

    These CEOs talked about how their AI and ML systems are improving their business products and services, reducing their costs, and growing revenue. They also talked about how AI and ML are helping their businesses move quickly and adapt efficiently to changing business and customer demands. The new systems are being highly optimized, servicing their customers better, and providing more opportunities for gaining deeper insight into a company’s business issues, costs, and improvement possibilities. These benefits are why C-suite executives and IT departments around the globe are adding AI and ML into all of their business and application development processes.

  3. Cloud is so yesterday. Cloud infrastructures centralize applications within their definitions, whether the clouds are local, remote, or hybrid. This cloud centralization brings all of an application's services and microservices into the same domain space, which may not be optimal for the business or for application performance. The speculation by some speakers and people I talked to at the conference was that the centralized cloud model may be eclipsed by a more decentralized processing model. As 5G, zero-latency, and edge computing mature, the centralized cloud application services may perform better decentralized. If the processing is at the edge and distributed to different servers, different clouds, and different parts of the network, the execution can be more customized for processing business rules and business specialties, and executed faster. Time will tell, especially since not everyone is even in the cloud yet and full 5G is years away.

  4. Everyone is worried that AI and ML are going to take their jobs. Sure, this may be true of some routine, low-skill situations. But early adopters of AI and ML are seeing that the jobs they thought could be replaced are instead being enhanced through the incorporation of AI and ML. For example, instead of a call center employee taking every customer call, an AI or ML chatbot may now take care of routine customer requests, leaving the customer representative to handle more complex and unique situations. Some call centers are bringing in AI to listen to customer service representatives’ interactions and suggest courses of action or upsell opportunities to the representative, or provide more details that can further enhance the customer experience. So AI and ML may not replace every job, but AI and ML will definitely change, and maybe remove the drudgery of, every job.

  5. It was interesting to listen to all the different speakers about their AI and ML efforts. One of the main themes that ran through all of the sessions was that AI and ML efforts need to be based on a company’s core business values to be effective and successful. Just like previous generations of data warehouses, analytics, or any “system of insight,” sound business profit principles need to drive your AI and ML service or product improvement choices.

People and processes that are infused with new AI and ML techniques, and that base their AI and ML usage on their core principles and values, will be the most successful. Using your company’s good data management and data governance practices is very important for setting up and maintaining your AI and ML environment, formulas, and self-improving ML rules. These good management and governance practices will help you acquire the right data, develop the correct AI or ML self-learning routines, and audit the feedback to continuously improve your formulas, business, and profits.

The sessions highlighted that there are AI- and ML-type business rules in use in your business today that have been around and optimized over many years or even decades. These existing business rule examples and their embedded technology can be a goldmine of insight into your business optimization. There are literally hundreds of AI and ML formulas available to be applied to your business. Choosing the correct algorithm based on the business situation and data available will make all the difference for your long-term AI and ML success. Looking at the existing business rules can direct, expedite, and improve your AI and ML development and execution.

There were also sessions that explained, and warned, that an initial AI development/implementation was not as effective as the original processes. Some companies have modified their new AI and ML routines to iterate over the transactions and continue to learn from the ongoing original processes. Together the AI and ML routines learn from the ever-expanding transactions and eventually become as efficient and effective as the original routines.

These five areas were the dreams, themes, and ideas from the THINK conference that brought a lot of great information to the thousands of attendees. Get ready: AI and ML will be endorsed by your upper management at every opportunity.


Dave Beulke is a business strategist, systems architect, and performance expert specializing in big data, data warehouses, and high-performance internet solutions. He is an IBM Gold Consultant, Information Champion, President of DAMA-NCR, former President of the International DB2 User Group (IDUG), and a frequent speaker at national and international conferences. His strategies, architectures, and performance tuning techniques enhance analytics, security, and performance so organizations can better leverage their information assets and save millions of dollars in time to market, CPU, development, and overall costs. Follow his blog at davebeulke.com, follow him on Twitter (@DBeulke), or connect through LinkedIn (https://www.linkedin.com/in/davebeulke).


Dave Beulke

President – DAMA-NCR

IBM Gold Consultant

Inaugural IBM DB2 Information Champion

email: [login to unmask email]

Blog: www.DaveBeulke.com

Dave Beulke & Associates

a division of Pragmatic Solutions, Inc

Syspedia - Find, understand and integrate your data faster!

Anton Britz

Moving DB2 from zOs to zLinux
(in response to Dave Beulke)

Problem: I have to move 11 TB of DB2 data from z/OS to zLinux with a team of about 8 people.

Guidelines: I did find the IBM Redbook called Practical Migration to Linux on System z, created in 2009.
What else? I am speaking to Rick from Cincinnati, Ohio today about suggestions/software/concerns, etc., but any help/suggestions will be appreciated.

https://www.youtube.com/watch?v=fB_ZlORvD4c

Anton

Philip Sevetson

Moving DB2 from zOs to zLinux
(in response to Anton Britz)
Anton,

Can we assume that this is mostly in one or two really large tables?

I’d extract the z/OS data using a DSNUTILB UNLOAD with delimited output, selecting by a range of distinct values (ideally, distinct values from the cluster key if you have one). Then move the files across from Unix System Services, with an SFTP session connecting to the target server cluster; load the moved data and check values; repeat with the next range, and so on. This will be a bit on the slow side, but it shouldn’t require you to buy any special equipment or cabling.

If you have severe time constraints, I don’t have as many options for you. (I’d defer to specialty network people about how to maximize transport speed.)
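As a rough illustration, the slice-by-slice loop described above (unload a key range, transfer it, load it, verify, repeat) could be driven by a script like the sketch below. This is only a dry run that prints the per-slice steps; every table, column, file, and host name is hypothetical, and the actual DSNUTILB control cards and target-side LOAD syntax would need adapting to your environment.

```shell
#!/bin/bash
# Hypothetical sketch of the slice-by-slice migration approach.
# All names below (table, key column, host, paths) are illustrative.

TABLE="MYSCHEMA.BIG_TABLE"        # assumed source/target table name
TARGET="zlinux-db01:/staging"     # assumed SFTP destination host:path

# Emit the three steps for one cluster-key range: unload, transfer, load.
gen_steps() {
  lo=$1; hi=$2
  f="unload_${lo}_${hi}.del"
  # 1. z/OS side: DSNUTILB UNLOAD control statement for this slice,
  #    producing a delimited flat file
  echo "UNLOAD DATA FROM TABLE $TABLE WHEN (CLUST_KEY BETWEEN $lo AND $hi) DELIMITED"
  # 2. USS side: push the flat file to the Linux guest over SFTP
  echo "echo 'put $f' | sftp $TARGET"
  # 3. Linux side: load the slice, then compare row counts before moving on
  echo "db2 \"LOAD FROM /staging/$f OF DEL INSERT INTO $TABLE\""
}

# Walk the cluster-key ranges one slice at a time
for range in 0:999999 1000000:1999999; do
  gen_steps "${range%:*}" "${range#*:}"
done
```

Generating the steps first, rather than executing them inline, makes it easy to review the plan, rerun a single failed slice, and hand ranges out to the eight team members in parallel.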

--Phil Sevetson
