#CallForCode Part I / 4 – Building Awesome Disaster Machine Learning Applications


Taking an in-depth look at several new services on IBM's PaaS platform and Salesforce Einstein through a unique use case: the Call for Code disaster relief initiative

IBM Call for Code 

By Brandon Kravitz


On August 19th the planet celebrated the United Nations' World Humanitarian Day, which raises awareness of the first responders who provide aid immediately after disasters.

Call for Code aims to improve the lives of the communities most threatened by natural disasters and, in turn, to ease and strengthen the important role that humanitarian agencies play.

Projects must utilize 5 myCloud services and address a need of disaster first responders. Sample projects included aerial drones that provide topographic insight into which areas need aid, deep learning image processing to identify and treat victims, and the provided IBM "Weather Company API" to fuel the competition with real disaster data.

Things You Will Need

| Item | Location | Description |
| --- | --- | --- |
| Salesforce | login.salesforce.com | Developer Edition of Salesforce with Analytics Studio |
| IBM myCloud | https://console.bluemix.net/catalog/ | IBM's catalog of services |
| Weather Company API | http://developer.weather.com | Raw data to use for the competition: a CSV data file and JSON data from the API |

Code Softly and Carry Good Analysis

In order to create a well-designed supervised machine learning tool, one must ask meaningful questions and prepare (or tidy) the data in a way that lets your predictions get closer to the true answer.


Let's ask some simple questions of the Call for Code data points:

  • How and where do natural disasters occur?

  • Can we utilize a large dataset or complex API with Salesforce Einstein?

  • How does one build a natural disaster algorithm?

HIC MANEBIMUS OPTIME

In order to build an event-driven disaster project, let's define a few metrics so that our datasets are meaningful and perform within our scope:

| Metric | Definition |
| --- | --- |
| Define Scope | Predict whether a household disaster will occur |
| Define Performance | Our target accuracy should be at least 70% |
| Context | Using IBM Weather Company Data, we can predict with 70% or greater accuracy which homes will be affected by disasters |
| Solution Data | Use machine learning workflows to process and transform Weather Company Data into a prediction model; this model must predict which homes will be affected by impending types of disasters |

As long as we keep our scope within reasonable limits, we should be able to create two distinct objects: Disasters and Households.

Telling Your Data Story - Enterprise Architect Edition

Anyone in tech knows that companies are changing their business requirements and compliance structures more frequently than ever. With deep learning feed-forward networks and constant advances in autonomous services, integrating data in real time is a competitive advantage any skilled developer should have in their wheelhouse. Working with event-driven architecture and on-demand data doesn't have to be overly complex: we will look at two classic platforms, Salesforce and IBM Bluemix, and apply event-driven architecture (Platform Events) through several methodologies and use cases.

IBM Call for Code is a great way for advanced and beginner devs alike to build a cutting-edge library of skills or just sharpen existing ones.

So you wanna be a Salesforce Enterprise Architect?

Let's take our proposed data types and transform them into some metadata!

First we are going to define a few items in Salesforce:

| Item | Description |
| --- | --- |
| Household (Account) | Describes homes, with Geolocation, Value, Income, Gender |
| Disaster__c | Disaster types, with Geolocation, Intensity, Time |
| HouseholdDisaster__e | Platform event that can return values and transform multiple objects |
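To sanity-check the model before wiring up events, you can create a sample record of each type from anonymous Apex. This is only a sketch: the field API names below (Value__c, Income__c, Intensity__c, Time__c) are assumptions and must match the fields you actually create.

```apex
// Anonymous Apex sketch — creates one sample Household and one Disaster.
// Adjust field API names to your own metadata.
Account household = new Account(
    Name = 'Sample Household',
    Value__c = 250000,
    Income__c = 60000
);
insert household;

Disaster__c disaster = new Disaster__c(
    Name = 'Flood',
    Intensity__c = 4,
    Time__c = System.now()
);
insert disaster;
```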

Create Household Disaster Platform Event

  • Create a Platform Event and a Long Text Area Field


Disaster__c Fields and Data Type

  • Create all these fields on Disaster Custom Object with the Correct Data Type


Implementation

Now that the fields and objects are in place, our implementation is fast and simple:

  • Utilize our platform event to create a disaster

  • Push Data from Outside Source into Salesforce (Part II)

  1. Copy the following code into the Developer Console (ensure you have created the correct fields on Disaster__c!):
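A minimal sketch of what such a Disaster class could look like. The class name DisasterPublisher and the Long Text Area field name Payload__c are assumptions; substitute the names you used when creating the platform event.

```apex
// Sketch of a publisher class for the HouseholdDisaster__e platform event.
public with sharing class DisasterPublisher {

    // Publish a JSON payload to the HouseholdDisaster__e event channel.
    // Payload__c is an assumed name for the event's Long Text Area field.
    public static Database.SaveResult publish(String jsonPayload) {
        HouseholdDisaster__e evt = new HouseholdDisaster__e(
            Payload__c = jsonPayload
        );
        // EventBus.publish queues the event for delivery to subscribers.
        return EventBus.publish(evt);
    }
}
```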


  2. Now that we have our Disaster class ready, we can send data to our event channel and create disasters. Copy the following code into the Developer Console (this trigger listens on the event channel and produces disaster records):
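A hedged sketch of such a trigger, assuming the event carries a JSON payload in a field named Payload__c and that Disaster__c has a numeric Intensity__c field (both names are assumptions):

```apex
// Fires after HouseholdDisaster__e events are published and turns each
// event's JSON payload into a Disaster__c record.
trigger HouseholdDisasterTrigger on HouseholdDisaster__e (after insert) {
    List<Disaster__c> disasters = new List<Disaster__c>();
    for (HouseholdDisaster__e evt : Trigger.New) {
        // Assume Payload__c holds JSON like {"type":"Flood","intensity":4}.
        Map<String, Object> data =
            (Map<String, Object>) JSON.deserializeUntyped(evt.Payload__c);
        Disaster__c d = new Disaster__c(Name = (String) data.get('type'));
        Object intensity = data.get('intensity');
        if (intensity != null) {
            d.Intensity__c = Decimal.valueOf(String.valueOf(intensity));
        }
        disasters.add(d);
    }
    insert disasters;
}
```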

Verify Our Results and Next Steps

Cool, we created some disastrous code!


You can test this data for now using the following raw JSON file and your favorite REST platform tester:

  • Log in to your Trailhead DE org.

  • Open a new tab and navigate to Workbench at https://workbench.developerforce.com/login.php

  • For Environment, select Production.

  • For API Version, select the highest available number.

  • Select I agree to the terms of service.

  • Click Login with Salesforce.

  • On the next screen, click Allow.

  • In the top menu, select utilities | REST Explorer.

  • Click POST.

  • Replace the URI with:
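Publishing a platform event over REST follows the standard sObject POST pattern; the API version (v43.0) and the field name Payload__c below are assumptions, so use your org's values:

```
/services/data/v43.0/sobjects/HouseholdDisaster__e
```

with a sample JSON body such as:

```json
{
  "Payload__c": "{\"type\": \"Flood\", \"intensity\": 4}"
}
```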

In Part II we will also take a deeper dive into a brand new connector called App Connect that will allow us to sync data to Salesforce with low overhead.

Part II...

Now that we have our households, disasters, and events set up, let's prepare some data from IBM to send to Salesforce Einstein using some state-of-the-art toolsets... TBD

