
Oil & Gas: UAE moves to quash talk of OPEC emergency meet as oil slumps


The United Arab Emirates moved to quash talk of a potential emergency meeting of the Organization of the Petroleum Exporting Countries (OPEC) after Nigeria's oil minister said on Tuesday a "couple" of members had requested a gathering.

 

 

Benchmark Brent crude futures slipped towards $30 a barrel to a near 12-year low before rising slightly. They have shed almost three-quarters of their value since mid-2014.

Such market conditions supported an emergency meeting to review whether OPEC should change strategy, Nigerian Minister of State for Petroleum Resources Emmanuel Ibe Kachikwu told reporters on the sidelines of an energy conference in Abu Dhabi.


However, UAE Energy Minister Suhail bin Mohammed al-Mazroui later told the same conference the current OPEC strategy was working, adding that time was needed to allow this to happen -- perhaps between one and 1-1/2 years.

 

Follow this link to read the full article >

(Source: The Africa Report)


SAP HANA on Power with SUSE Linux Enterprise Server for SAP Applications


SAP HANA is available on IBM-Power8-based servers combined with SUSE Linux Enterprise Server (SLES) for SAP Applications.

 

This was SUSE’s message in August of last year, when it announced an important enhancement for IBM Power Systems for SAP HANA. The combination with SLES for SAP Applications offers valuable additional functions to SAP customers who rely on IBM Power8 processors when using SAP HANA.

 

SLES for SAP Applications is the leading Linux distribution for SAP solutions on Linux and the first operating system to support IBM Power Systems for SAP HANA. The participating partners realized the IBM platform support as part of a longstanding cooperation. Since SAP HANA was previously only available on Intel x86 computer architectures, this new development gives SAP customers more freedom of choice when designing, building, and operating SAP HANA or SAP HANA applications. In addition, IBM Power8 is specially designed for processing large volumes of data from transactions and analyses, while offering optimized access to data that is saved and managed using in-memory technologies. IBM Power8 for SAP HANA thereby achieves extremely high performance while using fewer cores. Power8 also offers useful functions for high availability, scalability, and the provisioning of resources and capacities for native and virtualized SAP HANA deployments.

 

HA Features Out of the Box

 

In combination with SUSE, the leading operating system platform for SAP HANA on IBM Power, the benefits of Power8 increase further. One example is high availability (HA): with the SLES for SAP Applications operating system platform, users running SAP HANA on IBM Power get sophisticated HA features practically free of charge. HA was taken into account early in the development of SAP HANA on IBM Power, with SUSE contributing the HA expertise from its SAP HANA work, as it already had for Intel-x86-based SAP HANA systems. As a result, users benefit from highly available, business-critical real-time analyses. They get fast insight into their business situation, combined with the level of availability that they need for their workloads. This also applies to future application deployments such as cloud computing, real-time analytics, big data, Industry 4.0, or the Internet of Things.

 

SLES for SAP Applications Supports Power8 as of Version 11


SUSE supports Power8 for SAP HANA with SLES for SAP Applications as of Version 11, Service Pack 4, including High Availability Extension.

Revolutionizing NHL’s Fan Experience


In 2014, SAP and the National Hockey League reimagined the fan experience around statistics for hockey lovers worldwide. This is the story of how design supported the launch of the NHL's new stats experience - the first phase in a multiphase, multiyear partnership between SAP and the NHL.

Results at a glance:

  • >400,000 site views within the first 3 days
  • 749 million total online impressions
  • + 45% in engagement on stats site



Opportunity: Retelling the Story to Hockey Fans

Since first taking to the ice in 1917, the National Hockey League (NHL®) has seen amazing growth. From the first puck dropped, the method of capturing, formatting, and distributing NHL statistics has evolved as much as the game itself. As the official cloud software provider of the NHL, SAP collaborated with the League on the transformation of the NHL's statistics platform on NHL.com to provide real-time insight, so fans can dive into the world of hockey like never before. But what are hockey fans' desires? And what are the NHL's objectives? "In order to have a go at any project you first have to understand the people you design it for", says Anthony, lead designer for the NHL project.

Before beginning any design, the team dove deep into the world of hockey and the different ecosystems of the NHL. From players to teams, the NHL has an incredible amount of data within the hockey stats landscape. Our mission was to unlock the full potential of this vast wealth of information to bring the game to life in new and meaningful ways for fans. The aim was therefore to personally engage with people and talk to them to develop empathy for their needs and see the world of hockey from their perspective. "This was important to understand their broader motivations and wishes and not only their view on hockey statistics", he recalls. "The NHL loved the level of immersion that we took on to really gain empathy with the fans that we were designing for", adds Matthew, one of the designers on the project team. Knowing the fans' aspirations is the first step to a captivating design.

Approach: Diving Deep to Understand the Ecosystem

"The design process was anything but linear. It included continuous refining of feedback and ideas", Anthony recalls. Whether from fans, the NHL, or the team members from the Design & Co-Innovation Center in Palo Alto - revisions and iterations became the fuel behind design decisions. While designing, of course, the team never lost sight of their main target: keeping everything grounded in the fans' aspirations and building the whole experience around their needs. The team started with low-fidelity sketches to save time and effort. Ultimately, after extensive feedback sessions with fans and stakeholders, the design iterated its way to higher and higher fidelity, finally converging on pixel-perfect wireframes. The design team also created extensive, stand-alone design documentation to support the NHL development effort. This ensured that the vision and integrity of the designs would survive beyond the engagement.


Results: Revolutionizing the Fan Experience

"The project was a team effort across various SAP departments including Global Sponsorships, the Design & Co-Innovation Center, the SAP Implementation team, and many more", says Anthony. The teamwork helped the design team identify potential obstacles, which could then be addressed well before they became actual problems. The entire design process was one of constant collaboration and co-creation. Such a close working relationship allowed the team to quickly incorporate feedback and work through ideas, making sure the designs met all stakeholders’ objectives. This went beyond updating the visual look and feel but providing fans with a whole new set of tools to go deeper and deeper into the game. We introduced a brand new statistics homepage for fans to use as a launch pad into the NHL stats ecosystem.NHL.com/statswent live in February 2015, with a whole new fan experience on an engaging level never seen before. Since the launch of the website the site visits have increased in 25% with over 400K NHL.com/stats unique views within the first three days of going live. During the first week of launch there were a total of 107 unique pieces of coverage published, resulting in more than 749 million media impressions and 18 million twitter impressions, such as “Hockey nerds are going off today on my timeline as a result of the NHL's new stats initiative and I love it.” Due to this overwhelming excitement, the design project has also been submitted to the UX Award and the IXDA Awards. The video and the impressions on http://sapdesignservices.wix.com/nhl-stats-design offer even more insights into the design process and the partnership with the NHL.

The process for this redesign has really been a true collaboration between the NHL and SAP. We really wanted to give the user a clean experience which is easy to navigate and also give a level of interactivity.

Chris Foster, Director, Digital Business Development, NHL

InnoVention @ FSMDG


The Financial Services and Master Data Governance team celebrated SAP's spirit of innovation by organizing a six-week event, 'InnoVention @ FSMDG', from 25th September to 6th November 2015. The event aimed to foster a culture of innovation among colleagues while exposing them to new learnings and technologies. Teams vied with each other for the top 3 prizes.


The first stage of the event was to submit ideas on the 'InnoVention @ FSMDG' Jam page. A total of 14 ideas were submitted, with entries spanning multiple domains such as HR, productivity improvements, IoT, and banking. The second stage was to ensure that the idea owners had sufficient resources to implement their ideas. The organizers brought the idea owners and the colleagues who were interested in working on the ideas onto a single platform. In this way, both parties interacted, formed teams, and started implementation. Apart from normal project activities, the teams spent the subsequent days on discussions, brainstorming sessions, and learning.


The ideas were judged by Vijay Seethapathy and Srikanth Gopalakrishnan. On the final day of the event, 8 teams presented their ideas along with demos. The event was well received among colleagues, who waited eagerly for the results. As with all innovation events, the participants and the audience took home a multitude of learnings and realized the immense possibilities of innovation. We reached an important milestone in our innovation journey at FSMDG, and we intend to progress further in the years to come.


2016 PB Conference - Call for Presentations!


 

Charlotte PowerBuilder Conference 2016

Call for Session Abstracts



Now is your chance to inspire the PowerBuilder Community by presenting a session at the 2016 PB Conference, to be held once again in Charlotte, North Carolina. As a whole, the PB / Appeon Community can only get stronger by coming together and sharing ideas, know-how, and experiences. Showcasing your achievements in front of your peers will help us all grow!

Your session length may be 60, 90, or 120 minutes. Demonstration-type presentations should be limited to sixty minutes; in-depth technical discussions or detailed analyses/examples would be more appropriate for a 90- or 120-minute time slot.

Sessions need to be educational and technically focused. Your presentation materials should target the intermediate PowerBuilder and/or Appeon level.

Speakers must be available to present during the entire span of the May 16-20, 2016 conference. The PB Conference will offer complimentary registration to all approved speakers.

The deadline to submit a session for consideration is January 31, 2016. Acceptance notifications will be sent out no later than the end of February, 2016.

For more details, please visit the North Carolina PowerBuilder User Group website! 


Regards ... Chris

Identifying SAP Manufacturing Servers to Users at Logon


Hello,

 

With the New Year comes the opportunity to create a new blog.

 

Many customers have multiple SAP Manufacturing Execution environments within their IT infrastructure. Whether it is the SAP Manufacturing Execution QA or Development server, it can often be difficult for users to identify which SAP Manufacturing Execution server they are logging into from the URL itself.

 

One solution to this problem is to identify the SAP Manufacturing Execution servers via their login screens.

 

This blog will guide you through this process.


Whether you are logging into SAP Manufacturing Execution, SAP MII, or SAP NetWeaver, the same SAP NetWeaver UME screen is presented. Therefore, the following changes only have to be made once to be reflected in all three applications.

 

There are two main visual elements of the login screen that could be changed.

 

  • The Text Image


 

  • The Branding Image

 

These images are referenced within SAP NetWeaver Identity Management.

 

Configuration > Identity Management > Configuration > Open Expert Mode

 


To change these referenced images, you will need access to your NetWeaver server, where you save your new images. The path is:

 

\usr\sap\J2E\J02\j2ee\cluster\apps\sap.com\com.sap.security.core.logon\servlet_jsp\logon_ui_resources\root\layout

 

To reference the new image(s), simply click modify in expert mode and change the default image name to the name of the new image you wish to reference.

 

Below I have changed the reference to a new image for the Branding Text.

 


When completed, the changes should be saved.

 

This change is immediately available and doesn't require a restart of NetWeaver.

 


 

Thus the login page now shows the user which server they are logging into.

 

 


I hope this blog has been of some value.

 

Steve

UI Masking - Focusing on Customer Success


The renowned author Seth Godin says:

Over-focus on quality.

Expectations go up.

Sales rise as a result of word of mouth and customer satisfaction.

More money is spent on quality.

Repeat.

UI Masking is a Repeatable Customer Solution from Custom Development that allows users to mask sensitive data on a screen by maintaining a few customizing entries. That sounds easy, but it requires extraordinary effort from the team to keep it simple for the end users. Over the past year, we have focused on how we can improve the product to enable customer success in this niche area of data security.

First, let’s talk about some numbers:

  • Total number of customers – 50+
  • Sales demos given to customers by the team in 2015 – 20+

We interact closely with our customers and potential buyers. This gave us huge insight into how we can improve our product. We identified a few challenges that customers face and decided to evolve our solution to make customers' lives easier. These were:

Focus on making the installation process smoother:
Implementation of the add-on requires multiple configurations, and customers need to be on the correct NetWeaver SP. We noticed that a lot of customers would skip one or two checks and end up asking for support and help during installation. To empower customers to identify gaps, we developed an Installation Check Report which provides the current system status and the pending action items needed to complete the implementation process. This has added incredible value: we reduced the number of installation-related OSS messages, and it also helped us in consulting assignments when we were implementing the solution for customers.

Focus on making the customizing process simpler:
When configuring entries for masking, multiple entries have to be maintained manually for each data set. We enhanced our solution to maintain entries automatically at the click of a few buttons. So instead of manually identifying which programs display sensitive data, the solution helps the customer identify such programs. This results in reduced implementation effort and faster go-lives for customers.

Focus on quality and processes:
If we discover bugs during our day-to-day work, we resolve them without waiting for customers to raise OSS messages. This keeps us on our toes, and the solution keeps improving. We also created a Master Note which lists all the important Notes customers need to apply. This became useful because not all Notes are delivered in our namespace, as the solution requires modifications in other areas to work to its full capacity. We immediately saw a drop in the number of OSS messages asking which Notes should be applied.

Focus on covering new areas for masking:
As I mentioned before, talking to potential, new, and existing customers provides incredible insight into industry trends and new requirements. This enabled us to do CRM Web UI masking custom development projects for multiple customers, bringing in more business to SAP.

Result
As we freed up our time by automating and simplifying processes, we found ourselves working on higher-value functions, adding more value to the organization by doing new custom development projects, working on business development by providing consulting services, and enabling multiple customers to implement the solution successfully.

I would like to close this blog by saying that there is no single answer to improving your solution. As stakeholders, we need to understand the limitations of the solution, focus on customer needs, and then mix that with hard work to come up with innovative ways to add value and bring success knocking on the door. Thank you.

HCP IoT Service - Keyboard LED and Laptop Control without a Raspberry Pi


Everyone knows that SAP HANA Cloud Platform provides many services, e.g. the mobile service and the document service. Everyone has an HCP trial account, but most of us don't explore every service or feature of the HCP platform. I was exploring SCN and found the HCP IoT service, so I started exploring it. Thanks to Rui Nogueira, who has posted awesome blogs on HCP IoT.


After reading the blogs and the help.sap documents, I decided to create a demo app, but the issue was that I didn't have a Raspberry Pi device. I wondered whether we could use existing elements with the HCP IoT service. I got the idea to control a keyboard LED and a laptop using HCP IoT and a UI5 app, so I created this simple demo.


In the initial phase, my design was:



I used 3 components:


1) Python program - Raspberrypi.py

2) HCP IoT service configuration

3) UI5 app for mobile
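
As a rough illustration of the Python part without a Raspberry Pi, here is a minimal sketch that polls the HCP IoT Message Management Service (MMS) over HTTP and toggles a keyboard LED on the laptop. The MMS URL pattern, device ID, OAuth token, and message format below are assumptions for illustration only, and the LED toggling via xset works on Linux/X11:

import time
import subprocess
import requests

# Assumed HCP IoT MMS HTTP endpoint for pulling messages pushed to this device
HCP_URL = ("https://iotmms<account>trial.hanatrial.ondemand.com"
           "/com.sap.iotservices.mms/v1/api/http/data/<device-id>")
TOKEN = "<device-oauth-token>"  # assumption: device token from the MMS cockpit

while True:
    # Poll the IoT service for commands sent by the UI5 app
    resp = requests.get(HCP_URL, headers={"Authorization": "Bearer " + TOKEN})
    if resp.status_code == 200:
        for message in resp.json():
            # Assumed message format: any payload containing "on" or "off"
            if "on" in str(message).lower():
                subprocess.call(["xset", "led", "3"])    # Scroll Lock LED on
            else:
                subprocess.call(["xset", "-led", "3"])   # Scroll Lock LED off
    time.sleep(5)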


 

Final Execution - Watch this video

 

 

I'll publish the source code on my GitHub, along with more information about implementing this simple HCP IoT demo, in my blog series.

 

If anyone doesn't know about these topics:

HCP IoT - https://help.hana.ondemand.com/iot/frameset.htm?4ab3521d055f41e9bce8837d4abbc09d.html

Raspberry Pi - https://www.raspberrypi.org/

Internet of Things - https://en.wikipedia.org/wiki/Internet_of_Things

 

In my next blog, I will show you a more realistic demo of HCP IoT with a Raspberry Pi.


Thanks for reading my blog

 

BR

Ashish



How to reduce your HANA database size by 30%


I didn't write enough blogs last year and felt like I abandoned SCN a bit. Lately a few people have kindly commented that they enjoyed reading my content, which is nicer to me than I deserve. So here's a little gift to kick the year off.

 

This script is only useful if you have a HANA system that was installed with an older revision (SPS01-07) and has been upgraded a bunch of times so it's now on a newer release (SPS08-10).


In that scenario, it's possibly the most useful thing a HANA devop will see all year. In a productive HANA system, we saw the disk footprint reduced from 2.9TB to 1.89TB and the in-memory footprint reduced by over 100GB. It will also substantially decrease startup time, decrease backup time, and increase performance.

 

What happens is that HANA chooses the compression type of a column store object when it creates it, and only occasionally re-evaluates the compression type. In older databases that have had a lot of data loaded since the initial installation, this can mean that the compression is suboptimal. In addition, objects can be fragmented and use more disk space than is really required.
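
Before running the script, it can help to see how much of your schema is still on its initial compression. Here is a minimal sketch for checking this, assuming the open-source pyhdb driver and illustrative host, credentials, and schema name:

import pyhdb

connection = pyhdb.connect(host="hanahost", port=30015,
                           user="SYSTEM", password="secret")
cursor = connection.cursor()

# Columns whose compression was never re-evaluated typically still show
# COMPRESSION_TYPE = 'DEFAULT' in the M_CS_COLUMNS monitoring view.
cursor.execute(
    "SELECT TABLE_NAME, COLUMN_NAME, COMPRESSION_TYPE "
    "FROM M_CS_COLUMNS "
    "WHERE SCHEMA_NAME = 'MYSCHEMA' AND COMPRESSION_TYPE = 'DEFAULT'")
for table_name, column_name, compression_type in cursor.fetchall():
    print(table_name, column_name, compression_type)

connection.close()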

 

This script takes care of all that and cleans up the system. It takes some time to run (18h in our case).

 

A few caveats (these are general best practices, but I have to point them out)!

 

  • Run this script in a QA system before production, for test purposes and so you know how long it will take
  • Run it at a quiet time when data loads are not running
  • Ensure you have a full backup
  • Use this script at your own risk, like any DDL statement it could cause issues
  • Do not restart HANA during this operation
  • Complete a full backup after the script, and restart HANA to reclaim memory

 

Thanks to Lloyd Palfrey, who actually wrote it.

 

-- -----------------------------------------------------------------
--  HANA Reorg Script - 2015
-- -----------------------------------------------------------------
--  HOW TO RUN:
--  After creating the procedure you can run it with this statement:
--  call "_SYS_BIC"."dba_reorg"('INSERT_SCHEMA_NAME_HERE');
-- -----------------------------------------------------------------

CREATE PROCEDURE "_SYS_BIC"."dba_reorg"(IN pi_schema VARCHAR(60)) LANGUAGE SQLSCRIPT AS
CURSOR c_cursor1 FOR
  SELECT TABLE_NAME
  FROM M_CS_TABLES
  WHERE SCHEMA_NAME = :pi_schema;

BEGIN
  -- Recompress tables: force a re-evaluation of the compression type
  FOR cur_tablename AS c_cursor1 DO
    EXEC 'UPDATE "' || :pi_schema || '"."' || cur_tablename.TABLE_NAME ||
         '" WITH PARAMETERS (''OPTIMIZE_COMPRESSION'' = ''FORCE'')';
  END FOR;
END;
-- End of stored procedure

-- Reorg rowstore
ALTER SYSTEM RECLAIM DATA SPACE;

-- Trigger rowstore garbage collection
ALTER SYSTEM RECLAIM VERSION SPACE;

-- Create savepoint
ALTER SYSTEM SAVEPOINT;

-- Reclaim LOG space
ALTER SYSTEM RECLAIM LOG;

-- Reclaim DATA space
ALTER SYSTEM RECLAIM DATAVOLUME 110 DEFRAGMENT;

How Being First in Fair Trade Was All in the Tea Leaves


Established in 1997 as an association of three producer networks, Fair Trade International currently handles 19 national labeling initiatives around the world. Today there is rapidly growing awareness and demand for Fair Trade Certified products. A big reason for the geometric growth in demand is that purchasing Fair Trade labeled goods empowers consumers with the ability to obtain a great product while reducing poverty through the free markets – a veritable win-win.

 

According to a market research survey published in 2015 by GlobeScan, the Fair Trade label is the most top-of-mind, ethical, and environmentally friendly product label in the world. Thousands of consumers in 16 countries around the world participated in the survey, and of those 80% also said the Fair Trade mark would have a positive impact on their perceptions of brands.

 

A few other facts about the growth of the Fair Trade movement:

  1. There were over 1,800 Fair Trade Certified producer organizations in 2015, compared to 827 in 2009.
  2. Close to two million workers and farmers benefit from Fair Trade sales.
  3. In 2015 there were 420 certified Fair Trade producers in Africa alone.

The number of Fair Trade labeling organizations has grown rapidly around the world in just the last few years. However, back in 2000, long before the recent demand for certified products began to intensify, Choice Organic Teas of Seattle, Washington became the first tea crafter in the United States to offer Fair Trade Certified tea.

 

Choice Organic Teas works closely with international growers from the tree-lined hills of the Thotulagolla Tea Garden in Sri Lanka to the Makaibari Tea Estate at the foothills of the Himalayas, and the company is widely credited with greatly expanding awareness of the Fair Trade form of trade. In fact, they continue to offer more varieties of Fair Trade Certified Tea than any other tea company in North America.

 

Not surprisingly, Choice Organic Teas is also North America's number one selling certified organic tea line. As the company grew to include more varieties, more employees, and more facilities, its back-end systems needed to improve to keep up. They tried for a long time to make do with basic accounting systems, but there were too many redundancies in their processes and the margin of error was too high.

 

The company searched for a cloud-based ERP solution it could trust to manage accounting, production, and inventory. The company experienced what VP of Operations Ray Lacorte describes as "a few false starts" when trying everything from out-of-the-box solutions to unreliable basic spreadsheets. Finally, at the end of a long road of trial and error, there was a bright spot. Fortuitously, the company turned to SAP and SAP gold partner Softengine to help revamp its reporting and analytics tools. The result was a system that tracks everything related to inventory, from ingredients to sales performance.

 

“We feel like we’re in great hands with Softengine and the SAP Business One application they installed. It’s nice to finally put all those unreliable ERP systems and redundant procedures in the rear-view window – and we’ve barely scratched the surface of what SAP Business One can offer us.”

 

Ray Lacorte, VP of Operations, Choice Organic Teas

 

Choice Organic Teas surely had no way of knowing so many years ago that principles steeped in an environmentally friendly and ethics-driven Fair Trade philosophy would become so important to consumers making purchases today.

 

Ultimately, though, doing the right thing always ends up being the best decision - and the enduring success of this great American tea company is living proof of just that.

Visualizing Data with Jupyter and SAP HANA Vora, Part 1 / 2


In this blog post, we will explain how to set up Jupyter as a browser-based frontend to easily query and visualize your data.

 

Jupyter is a web application that allows you to create and share documents that contain live code, equations, visualizations, and explanatory text; see Project Jupyter.

 

This tutorial consists of two parts.

You are currently reading part one, which explains the basic steps to set up and configure Jupyter.

It is essential to complete part one before continuing with part two!

 

Part two demonstrates how to run queries in Python and how to visualize data using matplotlib.

 

 

 

Prerequisites

 

Before starting this tutorial, please make sure your cluster is up and running.

You should have started the Spark shell at least once and run some queries to test its functionality.

 

To complete part 2 of this tutorial, you need sample data, which can be downloaded here:

https://mdocs.sap.com/mcm/public/v1/open?shr=LMV6pH_012dtA13N-rtiwGZUAnulqt2zX4MSfGXQ51w

This file contains TPC-H sample data at scale factor 0.001.

Please download the file and extract its content to your HDFS.

 

Alternatively, you may generate the sample data on your own by downloading and compiling DBGEN:

http://www.tpc.org/tpch/tools_download/dbgen-download-request.asp

 

 

Installation

 

To get started, we need to install several packages that should come bundled with your Linux distribution.

Please run the following commands on a RedHat-based machine:

sudo yum install python-pip
sudo yum install python-matplotlib
sudo yum install gcc-c++
sudo pip install --upgrade pip
sudo pip install jupyter

You may install Jupyter on a jumpbox outside the cluster, for example, on an Ubuntu-based system.
Then, the first three commands are slightly different:

sudo apt-get install python-pip
sudo apt-get install python-matplotlib
sudo apt-get install g++
sudo pip install --upgrade pip
sudo pip install jupyter

 

 

Environment

 

Next, we need to set some environment variables to inform Jupyter about our Spark and Python settings.

Please adjust the paths and version number below according to your local environment, then either run these commands on the shell as the "vora" user, or put them in your ".profile", to have them loaded every time you log in:

export PYTHONPATH=/home/vora/vora/python:$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip
export ADD_JARS=/home/vora/vora/lib/spark-sap-datasources-<version>-assembly.jar
export SPARK_CLASSPATH=$ADD_JARS
export PYSPARK_SUBMIT_ARGS="--master yarn-client --jars $ADD_JARS pyspark-shell"

 

 

Configure Jupyter

 

Please run this command as the user "vora" to generate the initial configuration for Jupyter:

jupyter notebook --generate-config

Now, open an editor and edit the file "~/.jupyter/jupyter_notebook_config.py"

Since we are running on a remote machine with no window manager, we configure Jupyter not to open up a web browser on startup.

Please uncomment the line

# c.NotebookApp.open_browser = False

Uncomment means removing the pound sign at the beginning of the line.

 

To be able to access Jupyter from remote, we need to uncomment the following line as well:

# c.NotebookApp.ip = '*'

Notice: This will give everyone access to the Jupyter web interface.

In a production environment, you might want to set up access control.

Please refer to this guide on how to secure your Jupyter installation:

Securing a notebook server

 

After applying the above changes to the config file, please save your changes and close the editor.

 

Notice:

Usually, cloud providers and IT departments are very restrictive and may block access to Jupyter's TCP port (default: 8888).

Please make sure to include a rule in the firewall configuration allowing access to the port on the machine running Jupyter.

Consult the provider's documentation or your IT department for details.

Running Jupyter

 

To run Jupyter, first create an empty folder where you want to store your notebooks, and go into that folder.

Then run the following commands as the user "vora", e.g.:

mkdir notebooks
cd notebooks
jupyter notebook

This will start a Jupyter notebook server, listening on port 8888 for connections.

The console output will be similar to this:

[I 09:39:29.176 NotebookApp] Writing notebook server cookie secret to /run/user/1000/jupyter/notebook_cookie_secret
[W 09:39:29.200 NotebookApp] WARNING: The notebook server is listening on all IP addresses and not using encryption. This is not recommended.
[W 09:39:29.200 NotebookApp] WARNING: The notebook server is listening on all IP addresses and not using authentication. This is highly insecure and not recommended.
[I 09:39:29.204 NotebookApp] Serving notebooks from local directory: /home/d062985/notebooks
[I 09:39:29.204 NotebookApp] 0 active kernels
[I 09:39:29.204 NotebookApp] The IPython Notebook is running at: http://[all ip addresses on your system]:8888/
[I 09:39:29.204 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).

Now we can fire up a web browser on another machine and navigate to the URL of the host running Jupyter, e.g. http://jumpbox.yourcluster.internal:8888/

You should see the Jupyter start page.

 

By clicking New, you can start a new notebook that waits for your input.

After clicking, the empty notebook will open up.

 

Now, we can start submitting queries by entering the query into a paragraph and hitting the play button on top.
This will then execute the snippet in the background and return results to the webpage.
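
As a quick smoke test before part two, you can run a first query directly from a notebook cell. This is a minimal sketch, assuming the environment variables from the section above are set and that a table (here the hypothetical lineitem table from the TPC-H sample data) is already registered in the Spark catalog:

from pyspark import SparkContext
from pyspark.sql import SQLContext

# SparkContext picks up PYSPARK_SUBMIT_ARGS (yarn-client mode, Vora jars)
sc = SparkContext()
sqlContext = SQLContext(sc)

# Run a query and display the result in the notebook
df = sqlContext.sql("SELECT COUNT(*) AS cnt FROM lineitem")
df.show()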

 

 

Submitting queries and plotting data

 

The final part of this tutorial will take place in Jupyter.

Please download the attached Jupyter notebook "PythonBindings.ipynb.zip", unzip it, and copy it to the notebook folder on the machine running Jupyter.

Then, open the file in the Jupyter webinterface in your webbrowser.

Create Sales Order using BAPI with Variant Configuration


The purpose of this blog is to describe variant configuration in sales order creation. Before posting this blog, I referred to many SDN blogs, threads, and other sites for help, but without success.


Variant Configuration:

         

      It is a tool which helps to simplify the complex manufacturing of final product with more varieties and Variation of the input Material.


Examples of industries relevant to SAP VC: automobile manufacturing, furniture manufacturing, aircraft manufacturing, personal computers, elevator systems, bicycles, cars, motorcycles, pumps, etc.

 

Variant configuration is for manufacturing complex products in which the customer determines the features of the product; it also helps customers or salespersons to put together specifications for the product. The objective of variant configuration is to react quickly to customers' requirements.

                   

There is no need to create a separate material for each variant of a product. When companies introduce variant configuration, this often goes beyond a business process re-engineering project.

 

Variant configuration offers an opportunity to restructure product structures, for which processes are then defined. This has a direct impact on core areas such as marketing and product data management.


Variant configuration is useful if you have a large number of combinations of parts that go into a product, i.e. different permutations and combinations of the parts for the same material.

                   

For example:

  • In a business involving steel manufacturing, the customer may order steel with different physical properties like tensile strength, diameter, colour, etc.
  • A customer ordering a motorbike can choose different combinations of accessories and colours.

 

To define the features of a configurable material, you use characteristics. To enable you to use characteristics to configure a material, you allocate the material to a class of class type 300. Each configurable object must have a configuration profile. The configuration profile for a material controls the configuration process in the sales order.

                

You use dependencies to ensure that only allowed combinations of features are selected. Dependencies also select exactly the right BOM (Bill of Material) components and operations to produce a variant.


I had a requirement to create sales orders with variant configuration materials; in this scenario I used the function module BAPI_SALESORDER_CREATEFROMDAT2.


The usage of this BAPI is very simple when creating sales orders that do not use configurable materials. But when it comes to creating sales orders using variant configuration materials, the logic of filling the structures of this BAPI is a little bit more complicated.


SAP Note 549563 (BAPI: Filling configuration structures) explains how the configuration structures in the BAPI need to be filled.


To update variant configuration (VC) data for a sales order item, we need to populate the tables below of the standard function module or BAPI (e.g. BAPI_SALESORDER_CREATEFROMDAT2).


Sales Order Item Data


ls_item-itm_number = '000001'.
ls_item-po_itm_no  = '000001'.   "This field must be filled
ls_item-target_qty = '1'.
ls_item-material   = 'MAT12'.
APPEND ls_item TO lt_item.


The PO_ITM_NO field must be filled because the field POSEX (PO_ITM_NO) defines the connection between the sales order item and the configuration (BAPICUFG-POSEX): ls_item-po_itm_no = w_sales_cfgs_ref-posex.


If the item number is 000001, for example, then ls_item-po_itm_no = '000001' and ls_itemx-po_itm_no = 'X', so that the configuration is called.


Sales Order Item Data Flags


ls_itemx-itm_number = '000001'.
ls_itemx-po_itm_no  = 'X'.
ls_itemx-target_qty = 'X'.
ls_itemx-material   = 'X'.
APPEND ls_itemx TO lt_itemx.


Schedule Line structure


ls_schedules-itm_number = '000001'.
ls_schedules-sched_line = '0001'.
ls_schedules-req_qty    = '1'.
ls_schedules-req_date   = '20160120'.
APPEND ls_schedules TO lt_schedules.


Fill schedule line flags


ls_schedulesx-itm_number = '000001'.
ls_schedulesx-sched_line = '0001'.
ls_schedulesx-updateflag = 'X'.
ls_schedulesx-req_qty    = 'X'.
ls_schedulesx-req_date   = 'X'.
APPEND ls_schedulesx TO lt_schedulesx.


The mandatory structures to fill for variant configuration in the BAPI are:

    1. ORDER_CFGS_REF
    2. ORDER_CFGS_INST
    3. ORDER_CFGS_VALUE
    4. SALES_CFGS_VK


The ORDER_CFGS_REF and ORDER_CFGS_INST tables should have one record per item, and the combination of CONFIG_ID and ROOT_ID should be unique across line items.


* Filling configuration reference data: SALES_CFGS_REF table
w_sales_cfgs_ref-posex      = '000001'.   "Item number
w_sales_cfgs_ref-config_id  = '000001'.
w_sales_cfgs_ref-root_id    = '00000001'.
w_sales_cfgs_ref-complete   = 'T'.        "General indicator
w_sales_cfgs_ref-consistent = 'T'.
APPEND w_sales_cfgs_ref TO lt_sales_cfgs_ref.
CLEAR w_sales_cfgs_ref.


* Filling configuration instances: SALES_CFGS_INST table
w_sales_cfgs_inst-config_id     = '000001'.
w_sales_cfgs_inst-inst_id       = '00000001'.
w_sales_cfgs_inst-obj_type      = 'MARA'.
w_sales_cfgs_inst-class_type    = '300'.
w_sales_cfgs_inst-obj_key       = 'KL'.   "Material number
w_sales_cfgs_inst-quantity_unit = 'LF'.
APPEND w_sales_cfgs_inst TO lt_sales_cfgs_inst.
CLEAR w_sales_cfgs_inst.


In the ORDER_CFGS_VALUE and SALES_CFGS_VK tables we can have multiple characteristics for a material; in that case, the appropriate records should be inserted, and the combination of CONFIG_ID and ROOT_ID should be unique across line items.


* Filling characteristic values

* Color code
ls_sales_cfgs_value_in-config_id = '000001'.
ls_sales_cfgs_value_in-inst_id   = '00000001'.
ls_sales_cfgs_value_in-charc     = 'ZCOLOR'.   "Characteristic name
ls_sales_cfgs_value_in-value     = 'CHBC'.     "Characteristic value
APPEND ls_sales_cfgs_value_in TO lt_sales_cfgs_value.
CLEAR ls_sales_cfgs_value_in.

* Gauge
ls_sales_cfgs_value_in-config_id = '000001'.
ls_sales_cfgs_value_in-inst_id   = '00000001'.
ls_sales_cfgs_value_in-charc     = 'ZGAUGE'.   "Characteristic name
ls_sales_cfgs_value_in-value     = '24'.       "Characteristic value
APPEND ls_sales_cfgs_value_in TO lt_sales_cfgs_value.
CLEAR ls_sales_cfgs_value_in.


* Filling configuration variant condition key: SALES_CFGS_VK table

* Color code
ls_sales_cfgs_vk-config_id = '000001'.
ls_sales_cfgs_vk-inst_id   = '00000001'.
ls_sales_cfgs_vk-vkey      = 'ZCOLOR'.   "Characteristic name
APPEND ls_sales_cfgs_vk TO lt_sales_cfgs_vk.
CLEAR ls_sales_cfgs_vk.

* Gauge
ls_sales_cfgs_vk-config_id = '000001'.
ls_sales_cfgs_vk-inst_id   = '00000001'.
ls_sales_cfgs_vk-vkey      = 'ZGAUGE'.   "Characteristic name
APPEND ls_sales_cfgs_vk TO lt_sales_cfgs_vk.
CLEAR ls_sales_cfgs_vk.


Finally, call the BAPI:

CALL FUNCTION 'BAPI_SALESORDER_CREATEFROMDAT2'
  EXPORTING
    order_header_in     = ls_header
    behave_when_error   = 'P'
  IMPORTING
    salesdocument       = lv_vbeln
  TABLES
    return              = lt_return
    order_items_in      = lt_item
    order_items_inx     = lt_itemx
    order_partners      = lt_partner
    order_schedules_in  = lt_schedules
    order_schedules_inx = lt_schedulesx
    order_cfgs_ref      = lt_sales_cfgs_ref
    order_cfgs_inst     = lt_sales_cfgs_inst
    order_cfgs_value    = lt_sales_cfgs_value
    order_cfgs_vk       = lt_sales_cfgs_vk.


I hope this will be very helpful for those who are struggling with variant configuration while creating a sales order using a BAPI.


Thanks,

Harikrishna          

2016 Outlook for Brazil Business-to-Government Compliance: Audit Surge Anticipated Amidst Changing Standards


When Brazil announced its e-invoicing mandate in 2008, such strict and comprehensive regulatory measures were unprecedented. However, Brazil set the stage for business-to-government compliance initiatives, with 10 countries in Latin America plus others worldwide now enforcing similar measures. As we approach 2016, what should companies operating in Brazil know? Here’s a look ahead.

Audits expected to increase

Nota Fiscal (NFe) has now been enforced in Brazil for five years, giving the government ample amounts of financial data from companies operating there. With all of this information, the Secretaria de Estados da Fazenda (SEFAZ), Brazil's tax authority, has more opportunities than ever to identify errors and discrepancies, and has garnered valuable insights that will help it better detect fraud and omissions. Armed with all of this data, expect Brazil to ramp up audits and penalties this year.

Inventory management now affected

Brazil’s Block K mandate goes into effect on February 1, 2016, requiring companies to submit monthly inventory and production reports. This initiative represents a significant challenge to manufacturing, inventory management, supply chain and accounting teams, requiring fundamental changes to operational processes. Specifically, companies have to report details on each and every raw material or component used in a product, including inventory movement, components used/lost, finished products manufactured and more – information that is lacking in many current cost accounting structures.

With the enforcement of Block K, SEFAZ will now have full visibility into a product's life cycle, from material orders (through Inbound Nota Fiscal) to production (through Block K) to sales (through Electronic Nota Fiscal). Inconsistencies will result in fines, penalties, and even operational shutdowns.

eSocial implementation ramps up

Under the eSocial mandate, employers throughout Brazil are responsible for submitting all labor, social security, tax and fiscal information related to hiring and employment practices. This includes wages, hiring and contract details, warnings and suspensions, medical leave, etc. When fully implemented, all personnel information will be transmitted online, giving multiple government agencies information pertinent to their inspection scope. Much like Block K does for inventory management, eSocial requires a significant shift in the way companies document and process information on their labor force.

With its e-invoicing mandates setting the standard for similar regulations worldwide, Brazil has realized significant increases in its tax revenues - $58 billion in 2012 alone. In 2016, expect Brazil to increase its revenues even further with enhanced audits and changing standards that affect even more business processes. For a full look at what to expect in Brazil in 2016, download our 2016 Brazil Checklist of Mandates.

Rosters End User Manuals


Dear Experts,

 

I have configured the rosters for the promotion process and generated the roster points as well. But I'm stuck on the front-end part.

 

Could someone please send me the step-by-step process for the front end, or a user manual for the same, as it's urgent.

 

Thanks in advance!!

 

Regards,

Praneeth

Interesting: Neste, Veolia and Borealis to build a new combined heat and power plant


Neste, Veolia, and Borealis have agreed to create a joint venture company to build a new combined heat and power plant and produce and supply steam and other utilities to Neste’s refinery and Borealis’ petrochemical plant in Porvoo, Finland.

 

The company, Kilpilahti Power Plant Limited (KPP), will be owned 40% each by Neste and Veolia and 20% by Borealis. Neste will contribute its required equity share in KPP by transferring the current power plant to the joint venture company. The arrangement is subject to the finalisation of the financing agreements.

Agreements are expected to take place during 1Q16.

 

“We have combined forces to build this modern power plant, and from 2018 onwards its customers will have a more secure, efficient and clearly cleaner power source,” commented Matti Lehmus, EVP Oil Products at Neste Corp.

 

 

KPP will build four new steam and power generation assets with an installed capacity of 450 MW thermal and 30 MW electrical, running on side streams from the refineries as well as natural gas, with asphaltene being the main fuel. The total investment is approximately €400 million, of which an estimated €350 million is for the new power plant. The resulting new power plant will comply with the latest environmental regulations, including the European Commission's Industrial Emissions Directive (IED), and is expected to be in operation during 2018. Read the full article >


Stock taking using HANA and simplified UX


Can you imagine a task that one has to carry out regularly, which is supposed to help the business maintain its accounts, time, and costs correctly and increase profitability?

 

Filling up of timesheets?

 

Yes. But what if the business complains that carrying out this task itself is not easy, and that it consumes a lot of time and cost and loses revenue?

 

And, if the task is legally mandated?

 

Let's take an example of Warehouse Stock Taking now.

 

Customers who perform stock taking in their warehouses in order to maintain stock accounts accurately face similar issues. Let's say the time consumed for stock taking in a warehouse is 2 working days. If they perform this activity every quarter, that means 8 working days are spent on stock taking per warehouse every year. If there are 100 warehouses, then the total number of working days required for stock taking is 800 days per year! This is a huge number.

 

The worst part is: the warehouse business has to be shut down during the stock take, because there should be no goods movement for sales, and billing will be lost for those 800 working days. There is a chance that the customer could also lose business to the competition.

 

Imagine the humongous loss this creates every year, given that the customer has an enviable reputation for professionalism and customer centricity. To keep up with the thousands of invoices it issues each day, and to ensure customers get the right products in the right locations at the right time, the customer has to somehow overcome this business challenge.

 

Though it is said that the business has to be stopped for the stock take to be efficient, why don't we think of some out-of-the-box processes that remap how stock taking is currently performed? Something that connects people, devices, and inventory together; something that can accelerate the execution and the process steps in between the stock counting; something that can talk to sensors, speed up the inbound and outbound goods movement signals to action, and have the most user-friendly screens. In short, a very promising value proposition.

 

If the stock taking is performed without blocking the inventory (or at least blocking it for a minimal time), there will be minimal business loss. In this case, we can think of a simple and comprehensive Fiori-like UI screen to record the count along with a date/timestamp and store it in a custom table in the HANA DB. Let goods movements (GR/GI) also be posted at the same time. There is no need for inventory blocking at this stage, which allows the business to continue. The date/timestamp is recorded for each posting transaction. Once the counting is performed, a custom-built, powerful reconciliation program can determine the exact stock count by comparing the timestamp of each goods movement with that of the count taken. The report could also be used to track and analyze stock differences and check various issues, powered by the in-memory architecture, and then post/clear any stock deviations in the WM/IM system.
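
To make the reconciliation idea concrete, here is a purely illustrative Python sketch of the core logic (all names and data are hypothetical): each count is adjusted by the goods movements posted after that count's timestamp, so postings never need to be blocked.

from datetime import datetime

def reconcile(counts, movements):
    """Adjust each counted quantity by the movements posted after the
    count was taken, yielding the current stock without blocking postings."""
    result = {}
    for material, counted_qty, counted_at in counts:
        later = sum(qty for mat, qty, ts in movements
                    if mat == material and ts > counted_at)
        result[material] = counted_qty + later
    return result

counts = [("MAT1", 100, datetime(2016, 1, 20, 10, 0))]
movements = [("MAT1", -5, datetime(2016, 1, 20, 11, 30)),  # goods issue after the count
             ("MAT1", 20, datetime(2016, 1, 20, 9, 0))]    # receipt before the count (ignored)
print(reconcile(counts, movements))  # {'MAT1': 95}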

 

Such solutions can be implemented to tackle problems like loss of business during stock takes in warehouses across industry lines. Mobile apps for counting stock will also help make the stock taking process faster.

 

Likewise, there could be many such business cases for improving existing business applications with speed and UX. SAP is ready for this digital revolution with the right set of tools and technology!

BSI TaxFactory Cyclic J and the New RFC Wrapper


No doubt by now you've seen some announcements regarding the end of support for the "classic" RFC Library and how this impacts BSI TaxFactory. As a BSI customer, you probably received an email from BSI Support about this. Either way, if you haven't already, I encourage you to have a look at Note 2219445 (BSI TaxFactory wrapper code redesign - old RFC Library going out of maintenance).

 

You've probably also seen some discussions around the subject of just how to obtain and install or upgrade this new RFC Library, and if you're like me, perhaps even been a little confused as to just what you need to do.

 

Take heart! It's not that difficult. At the core of it, this is just another Cyclic Update, with only a couple easy extra steps. The basic process for applying a Cyclic Update has not changed from the procedure I described in BSI TaxFactory 10 Cyclic Update. Here I'll just go over the few extra steps.

 

Remember the deadline for maintaining support: 31 March 2016. So plan on getting this through your DEV and QAS track and into production before that date.

 

Prerequisite SAP Note

Before you update TaxFactory to Cyclic J, it is necessary to apply Note 2242290 (BSI: Changes for Cyclic J of BSI TaxFactory 10.0) to your ECC system. This is due to some changes in the structure of the data that TaxFactory sends back to ECC once this Cyclic is applied. Note, it's perfectly fine to apply this Note at any time before you apply the Cyclic, as it is backwards-compatible with earlier Cyclics.

 

 

Apply the Cyclic

Now apply Cyclic J to your TaxFactory system using the same standard procedure outlined in BSI TaxFactory 10 Cyclic Update. Only a few items are different from the update to Cyclic F described in that blog.

 

TF10 Client

During the TF10j client update, you may notice the message:

 

Querying Tomcat7-PRD failed (0). Waiting...

 

Do not be alarmed! This is normal. What is happening here is that the installer is stopping the Tomcat service and is querying to determine when it has stopped before going on to delete files. The query "fails" until it detects a successful stop, which may take a moment or two. Once the service stops, the query succeeds and the installer proceeds. Otherwise, this part is the same as previous TF10 client cyclic updates.

 

TF10 Server

Again, this process is mostly the same, but here there is one additional step to take. Also, you might find it necessary to shut down the SAP Gateway (via SAPMMC) before copying the new executables into the working folder, as the gateway process may hold a lock on the tf10server.exe file.

 

You will notice two new files that didn't exist in prior cyclics: tf10server_new.exe and tf10serverdebug_new.exe. These are the versions that use the new NetWeaver RFC Library. By default, BSI is providing them as optional files, while the default files still use the Classic RFC Library -- the one that is being deprecated.

 

However, we don't want to use the old library, we want the new one, so the procedure is to rename the files (a small script sketch follows the list):

 

  1. Rename tf10server.exe and tf10serverdebug.exe to tf10server_old.exe and tf10serverdebug_old.exe, respectively.
  2. Rename tf10server_new.exe and tf10serverdebug_new.exe to tf10server.exe and tf10serverdebug.exe, respectively.
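
If you prefer to script the renames, here is a small sketch; the server folder path is an assumption based on a default installation, so adjust it to your environment:

import os

server_dir = r"C:\BSI\TaxFactory\server"  # assumed install folder; adjust as needed
renames = [
    ("tf10server.exe",          "tf10server_old.exe"),
    ("tf10serverdebug.exe",     "tf10serverdebug_old.exe"),
    ("tf10server_new.exe",      "tf10server.exe"),
    ("tf10serverdebug_new.exe", "tf10serverdebug.exe"),
]
# Order matters: move the old executables aside before renaming the new ones
for src, dst in renames:
    os.rename(os.path.join(server_dir, src), os.path.join(server_dir, dst))
    print("renamed", src, "->", dst)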

 

 

Cyclic Data File

Don't forget, to launch the new client in your browser, you must edit your favorite or shortcut URL. The new URL will look something like this:

 

http://<TFhost>.<domain>.com:8091/eTF10j/PRD

 

Notice the letter indicator of the Cyclic ("j" in "eTF10j") included in the path. That's the part you must change.

 

Load the cyclic data file as you would normally. You will notice a new feature in the client for monitoring the manual load status, which is a nice addition.

 

New RFC Library

Here comes the fun part. At this point, nothing works. Well, OK, the TF client works, but you don't have any working communication between ECC and TaxFactory, and not just because you haven't restarted your Gateway yet. If you're like me, you probably assumed that your fancy 7.42 Gateway would have the new RFC functions embedded. Nope. Then you might assume that you can use the 7.42 version of sapnwrfc.dll, etc. Maybe, but I had a devil of a time trying to make that work.

 

You might assume the readme.txt that came with the new Cyclic J server executables would have detailed installation instructions. Not really. It just says to put the new libraries in the directory where tf10server.exe is running, but doesn't say which libraries. Note 1025361 (Installation, Support and Availability of the SAP NetWeaver RFC Library), which everything points to, does give some hints, but it isn't very explicit in its instructions. Indeed, as it kept pointing to a 7.20 version of the RFC Library, not to mention a full SDK, and it seemed that a 7.42 version was available, I figured the Note must be out of date.

 

I spent more than a day or two trying to make it work with a 7.42 version of sapnwrfc.dll, fruitlessly.

 

There might indeed be a way to make the 7.42 dlls work, but for our purposes here you do in fact require the 7.20 version mentioned in the Note.

 

Download the RFC Library

To obtain the correct library, open up Support Packages and Patches | SAP Support Portal, and under Software Downloads... Support Packages and Patches select Browse Download Catalog. Navigate to Additional Components... SAP NW RFC SDK... SAP NW RFC SDK 7.20... <server OS platform, e.g. "Windows on x64 64bit"> and select the latest patch level of the NetWeaver RFC Library (at this writing, it is patch 38 published on 11/13/2015). Download the file and unpack it with SAPCAR as you normally would.

 

Install the RFC Library

The installation is simple: you just copy the appropriate files to the appropriate folder. To find the files you need, drill into the unpacked folder's \nwrfcsdk\lib directory and copy all the files you find there.

 

 

Paste these copied files into your working TaxFactory server folder (e.g. \BSI\TaxFactory\server), where your tf10server.exe executable resides.

 

 

That's pretty much it!

 

You can restart your Gateway via SAPMMC now.

 

Test and Sync

Time to test that it works. A connection test to BSI10-US-TAX via SM59 should be successful. Good old RPUBTCU0 via SA38 should be successful. And HRPAYUS_SYNC_TAX_DT should correctly report the new Cyclic and appropriate Regulatory update for Level in BSI Client.

 

However, you still don't have Cyclic J in your ECC system. For that you'll need to run a sync, even if your Regulatory levels already match. And, as usual after a sync in DEV, you'll need to create both cross-client and client-specific transports to migrate the new status to QAS and PRD.

Will the fourth industrial revolution improve the world?


Technology and the Internet make it a lot easier to wage wars in unconventional ways. From 3D printed weapons to genetic engineering in hidden labs, destructive tools across a range of emerging technologies are becoming readily available. Developments like these raise a number of questions for visionaries like Espen Barth Eide, a member of the Managing Board at the World Economic Forum.

 

According to Eide, we need to address inequalities that can lead to global resentment, ethical issues about the use of technology and the education of people worldwide to embed new norms and values in society and provide more opportunities for growth. That’s why the brightest minds from government, business and civil society meet annually to tackle big issues together in the collaborative ‘Spirit of Davos.’

 

The Future of the Internet

 

The World Economic Forum has been convening annually for 45 years. Its mission is to improve the state of the world through public/private cooperation.

On the bright side, technology and the Internet play a key role in achieving the 17 global goals outlined by the United Nations for a sustainable future. Hyperconnectivity and the IoT are driving a new cycle of global economic activity focused on sustainable solutions that could end our dependence on fossil fuels, for example. Now, technology can help us reduce waste and redesign production and consumption systems to be more resource efficient. The Internet enables online learning, instant communication, and a world of opportunity for sharing knowledge and best practices.

 

The Challenge of Employment

 

As we change technology, it changes us! The way we live, work and interact with each other and with the things around us will never be the same. With software dominating the world and machines and robots doing the work of humans, we are entering an era in which jobs become obsolete much faster than new ones are created. Technology is threatening jobs that were previously considered ‘safe’ from replacement by machines. Consider the impact of self-driving vehicles on professionals from taxi drivers to airplane pilots. Even pizzas are being delivered by drones!

 

Certainly, new products and processes will lead to new growth. But change does not happen at the same pace everywhere. Societies that are technologically advanced will profit more while others will lag behind. How can we ensure that everyone will have the right education and skills to benefit from these exciting developments?

 

In the vast timeline of history, we are in the fourth phase of the Industrial Revolution. In past centuries we moved from manual to mechanical to mass production. With the advent of electronics and IT we moved into the age of automated production, and we are now entering the world of cyber-physical systems.

 

The Challenge of Food and Agriculture

 

The greatest challenge of this age is disruption. On all levels – social, economic and political – we are facing digital divides. Global warming, meanwhile, has dramatically changed how we produce and consume food.

 

Over the next 30 years, the demand for food will outstrip the supply. Widespread drought and limited access to water are affecting both the production and the price of food. Practices such as deforestation and single-crop farming are destroying our food chain and damaging the planet’s biodiversity. And finally, how safe is our food? The demand for cheap food in large volumes leads to cheap ingredients, or even non-food elements, being substituted into the food chain.

 

Digital transformation is improving lives

 

With the right strategy and technology, SAP is helping to create solutions to these complex, difficult problems. A member of the World Economic Forum for years, SAP has played a proactive role in the public/private collaboration on the journey to improve lives. This year SAP CEO Bill McDermott was invited to join the fight against cancer in a taskforce headed by US Vice President Joe Biden.

 

That’s not all! Click here to find out how SAP is helping customers, organizations and individuals achieve the 17 Goals for a Sustainable Future. You will be awed and inspired by the stories and examples of technology and people changing the world for the better!

  

@magyarj

SAP Multichannel Foundation for Utilities and Public Sector (MCF) Explained!


Since its launch in November 2013, SAP MCF has been widely adopted by customers across the globe and has continued to grow as a solution, with several releases since then. So what is SAP MCF? Simply put, SAP MCF lets utilities digitize their customer interactions, in a simple and cost-effective way, through self-service digital channels such as web, mobile and social. It does so by abstracting the complexity of the SAP Business Suite for Utilities into simple, open-standard REST-based APIs/OData services that can easily be consumed by consumer-facing apps on any channel, built with any technology.

 

[Screenshot: MCF1.jpg]


The following are the things that should come to mind when you think of SAP MCF.

 

Apps

 

Three template apps that can be deployed for use by end consumers are delivered as part of MCF: a standard desktop web app, a responsive-design app that dynamically renders itself to the device form factor (think of it as a mobile-optimized web page), and a native mobile app for Android and iOS. Technically, the apps are built with SAPUI5/HTML5. The interesting thing is that all the apps use the same MCF REST-based APIs/OData services, so no more costly app-per-app integration is required; even better, the customer sees the same data regardless of which app he or she is using.
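To get a feel for how such a shared service is consumed, here is a minimal sketch of a UI5 snippet reading bills from an MCF-style OData service. Note that the service path ZMCF_UTILITIES_SRV, the Bills entity set and its properties are hypothetical placeholders for illustration, not the actual MCF service names:

// Minimal sketch: consuming a (hypothetical) MCF OData service from SAPUI5.
// The service path and the "Bills" entity set are illustrative placeholders.
jQuery.sap.require("sap.ui.model.odata.v2.ODataModel");

var oModel = new sap.ui.model.odata.v2.ODataModel("/sap/opu/odata/sap/ZMCF_UTILITIES_SRV/");

// Read the customer's bills; the web, responsive and native apps would all
// consume this same service, which is exactly the point of the architecture.
oModel.read("/Bills", {
    success: function(oData) {
        oData.results.forEach(function(oBill) {
            console.log(oBill.BillId, oBill.AmountDue);
        });
    },
    error: function(oError) {
        console.error("Could not load bills", oError);
    }
});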

 

[Screenshot: MCF2.jpg]


55+ pre-integrated self-service scenarios

Working with our customers, we have identified the most important self-service scenarios and delivered them as part of the standard solution. A scenario is delivered as a single REST-based API/OData service, or a combination of them, fully integrated with the SAP Business Suite for Utilities. The scenarios range from simple ones, like viewing bills or changing personal data, to more advanced ones, like viewing smart-meter consumption, signing up for a new product, or submitting a meter reading anonymously. In short: lots of content that can be used out of the box. Have a look below.

[Screenshot: blog1.jpg]


[Screenshot: blog2.jpg]


Platform

While SAP MCF is an out-of-the-box solution offering ready-to-use self-service scenarios and apps, it is just as much a high-productivity platform that makes it easy to enhance the standard scenarios or develop new ones. It is powered by SAP Gateway and the SAP Mobile Platform, which provide enterprise-grade security, scalability and extensibility. SAP Gateway, where the OData services are built and consumed, is complemented by a framework that makes it easy to enhance the Utilities OData services or add new ones. The MCF mobile app runs on the SAP Mobile Platform, which, in addition to providing the mobile infrastructure, contains various tools to enhance existing mobile apps or create new ones.

 

No Upgrade required

SAP MCF is delivered as a set of three non-modifying add-ons, one each to be deployed on ISU, CRM and Gateway. As long as the minimum release levels (ISU 6.04 and/or CRM 7.0) are met, it is a simple add-on installation with no upgrade needed. This means you can get up and running with MCF with minimal disruption to your existing business systems. Note also that CRM is optional: MCF can be deployed on ISU-only installations.

 

Misc

If all of the above is still not enough, MCF comes with more features such as its own user management, channel analytics, a 360° customer view, push notifications, and co-browsing between agent and customer.

 

Keep exploring the MCF page for more interesting information. I look forward to your comments.

SAPUI5 and Twitter API


Hi,

 

This blog will show how to use the Twitter REST API in a SAPUI5 application.  In this example I will use the Twitter Search API (search/tweets.json).

All other API services can be found here: API Console Tool | Twitter Developers. (Log on with your Twitter account.)

 

Prerequisites:

 

- create a Twitter application (Twitter Application Management)

- retrieve: Consumer Key, Consumer Secret, Access Token and Access Token Secret (these appear as customer_key and customer_secret in the code below).

 

Creating a signature

 

This explains how to generate an OAuth 1.0a HMAC-SHA1 signature for an HTTP request. This signature will be suitable for passing to the Twitter API as part of an authorized request.


First, we have to declare some variables:

 

var customer_key = "a5IGWlSrrLyI7GEPV5ZMY1MiP";
var customer_secret = "Ey9xYIZwSPfuMZhS4IItTbWB4hXQD7resrPJpYbGc4PeMJBRdp";
var access_token= "2744465306-RGKnQQl5l5p1bZBNlUqbYrEROBqvZx6ij5ZIOos";
var access_token_secret = "pQ2TvVjUOeqPE0Eyk7OekzKIPGzcDvTNBZuUmAZiH9awd";                                        
var signing_key = customer_secret + "&" + access_token_secret; // OAuth 1.0a signing key: both secrets joined by "&"
var signature_method = "HMAC-SHA1";
var authorization_version = "1.0";
var method = "GET";
var protocol = "https";
var server = "api.twitter.com";
var version = "1.1";
var service = "search/tweets.json";


STEP 1: Create the base URL:

 

The base URL is the URL to which the request is directed, minus any query string or hash parameters.

 

var BaseURL = protocol + "://" + server + "/" + version + "/" + service;


STEP 2: Collect your parameters:

 

Next, gather all of the parameters included in the request.


// note: the oauth_* parts below include a trailing "&" so they can be concatenated directly later
var callback = "callback=twitterCallback";
var count = "count=" + this.count;
var language = "lang=" + this.lang;
var oauth_consumer_key = "oauth_consumer_key=" + customer_key + "&";
var oauth_nonce = "oauth_nonce=" + this.makeid() + "&";
var oauth_signature_method = "oauth_signature_method=" + signature_method + "&";
var oauth_timestamp = "oauth_timestamp=" + Math.floor(Date.now() / 1000) + "&";
var oauth_token = "oauth_token=" + access_token + "&";
var oauth_version = "oauth_version=" + authorization_version + "&";
var query = "q=" + encodeURIComponent(oEvent.getParameter("query"));
var result_type = "result_type=" + this.resultType;
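A note on makeid(): it generates the random oauth_nonce, but its implementation isn't shown in this blog. A minimal sketch of such a helper on the controller, assuming a plain random alphanumeric string of ten characters (to match the nonce oJV2FF2PtC visible in the example signature base string below), could look like this:

// Hypothetical sketch of the makeid() nonce helper (the original is not
// shown): returns a random 10-character alphanumeric string.
makeid: function() {
    var text = "";
    var possible = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";
    for (var i = 0; i < 10; i++) {
        text += possible.charAt(Math.floor(Math.random() * possible.length));
    }
    return text;
}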


STEP 3: Create the authorization parameter string and search options:

 

var oauth_parameters = oauth_consumer_key + oauth_nonce + oauth_signature_method + oauth_timestamp + oauth_token + oauth_version;
var searchOption = query + "&" + count + "&" + result_type;


STEP 4: Create the parameter string:

 

This is a very important step: all parameters must be concatenated in alphabetical order by parameter name:

 

var parametersString = callback + "&" + count + "&" + language + "&" + oauth_parameters + query + "&" + result_type;


STEP 5: Create the signature base string:

 

  1. encode the complete parameter string
  2. encode the base URL
  3. prepend the HTTP method (in upper case!)

 

var signatureBaseString = method + "&" + encodeURIComponent(BaseURL) + "&" + encodeURIComponent(parametersString);

 

It should look like this:

 

GET&https%3A%2F%2Fapi.twitter.com%2F1.1%2Fsearch%2Ftweets.json&callback%3DtwitterCallback%26count%3D100%26lang%3Dnl%26oauth_consumer_key%3Da5IGWlSrrLyI7GEPV5ZMY1MiP%26oauth_nonce%3DoJV2FF2PtC%26oauth_signature_method%3DHMAC-SHA1%26oauth_timestamp%3D1453384662%26oauth_token%3D2744465306-RGKnQQl5l5p1bZBNlUqbYrEROBqvZx6ij5ZIOos%26oauth_version%3D1.0%26q%3Datos%26result_type%3Dmixed



STEP 6: Calculate the signature:


In this step we create the authorization signature.

 

For this, I use two scripts from CryptoJS (crypto-js - JavaScript implementations of standard and secure cryptographic algorithms - Google Project Hosting).

 

 

I downloaded them to my project folder and use them in my application:

 

jQuery.sap.require("TwitterSearch_v002.util.HMACsha1");
jQuery.sap.require("TwitterSearch_v002.util.EncodeBase64");

 

At first I had tried including them directly from the web:

 

jQuery.sap.includeScript("http://crypto-js.googlecode.com/svn/tags/3.1.2/build/components/enc-base64-min.js");
jQuery.sap.includeScript("http://crypto-js.googlecode.com/svn/tags/3.1.2/build/rollups/hmac-sha1.js");

 

but I got the following errors, because the page is served over HTTPS while the scripts were requested over plain HTTP:

 

Mixed Content: The page at 'https://webidetesting1208672-s0009219687trial.dispatcher.hanatrial.ondemand…onentPreload=off&origional-url=index.html&sap-ui-appCacheBuster=..%2F..%2F' was loaded over HTTPS, but requested an insecure script 'http://crypto-js.googlecode.com/svn/tags/3.1.2/build/components/enc-base64-min.js'. This request has been blocked; the content must be served over HTTPS.t @ sap-ui-core.js:88

sap-ui-core.js:88 Mixed Content: The page at 'https://webidetesting1208672-s0009219687trial.dispatcher.hanatrial.ondemand…onentPreload=off&origional-url=index.html&sap-ui-appCacheBuster=..%2F..%2F' was loaded over HTTPS, but requested an insecure script 'http://crypto-js.googlecode.com/svn/tags/3.1.2/build/rollups/hmac-sha1.js'. This request has been blocked; the content must be served over HTTPS.

 

Getting the signature:

 

// HMAC-SHA1 over the signature base string, keyed with the signing key
var hash = CryptoJS.HmacSHA1(signatureBaseString, signing_key);
// Base64-encode the digest, then percent-encode it for use in the URL
var base64String = hash.toString(CryptoJS.enc.Base64);
var oauth_signature = encodeURIComponent(base64String);


STEP 7: Build the URL

 

OK, so now we have everything we need to build our URL to call the API:

 

Concatenate as follows:

 

var URL = BaseURL + "?" + searchOption + "&" + oauth_parameters + "oauth_signature=" + oauth_signature + "&" + language + "&" + callback;

 

So now we have the URL to retrieve the data.

 

Getting the data

 

I use JSONP to retrieve the data via a callback function; the Twitter API supports JSONP. At the beginning we already added the "callback" parameter to the parameter string. The name of the callback function can be freely chosen; in this example I use "twitterCallback". Don't forget to include this parameter in the parameter string!

 

First you have to inject the script into the page; the following utility appends a script tag to the document body:

 

var socialGetter = (function() {
    /* just a utility to do the script injection */
    function addScript(url) {
        var script = document.createElement('script');
        script.async = true;
        script.src = url;
        document.body.appendChild(script);
    }
    return {
        getTwitterTweets: function(url) {
            addScript(url);
        }
    };
})();

 

Then you have to define the callback function:

 

window.twitterCallback = function(data) {
    if (data) {
        var twitterResult = new sap.ui.model.json.JSONModel();
        twitterResult.setData(data);
        sap.ui.getCore().byId("__xmlview0").setModel(twitterResult, "twitterResult");
    }
};

 

Notice that I use "window.twitterCallback": the JSONP callback is always invoked as a global function, so you have to declare it on the global (window) object.

 

With this line:

sap.ui.getCore().byId("__xmlview0").setModel(twitterResult, "twitterResult");

I retrieve the search view and set the model "twitterResult" on it.

 

OK, so now we can call the URL:

 

socialGetter.getTwitterTweets(URL);

 

View

 

In the view, I use a List of ObjectListItems:

 

<List busyIndicatorDelay="{masterView>/delay}" growing="true" growingScrollToLoad="true" id="list"
    items="{ path: 'twitterResult>/statuses', sorter: { path: 'accountID', descending: false }, groupHeaderFactory: '.createGroupHeader' }"
    mode="{= ${device>/system/phone} ? 'None' : 'SingleSelectMaster'}" noDataText="{masterView>/noDataText}"
    selectionChange="onSelectionChange" updateFinished="onUpdateFinished">
    <infoToolbar>
        <Toolbar active="true" id="filterBar" press="onOpenViewSettings" visible="{masterView>/isFilterBarVisible}">
            <Title id="filterBarLabel" text="{masterView>/filterBarLabel}"/>
        </Toolbar>
    </infoToolbar>
    <items>
        <ObjectListItem icon="{twitterResult>user/profile_image_url}"
            intro="{twitterResult>user/name} - {twitterResult>created_at} - {twitterResult>user/location}"
            press="onSelectionChange" title="{twitterResult>text}"
            type="{= ${device>/system/phone} ? 'Active' : 'Inactive'}"/>
    </items>
</List>

 

An example of a tweet looks like this:

 

 

"statuses": [   {  "metadata": {  "iso_language_code": "en",  "result_type": "recent"   },  "created_at": "Thu Jan 21 14:47:27 +0000 2016",  "id": 690183933741310000,  "id_str": "690183933741309954",  "text": "Throwback Thursday    // NYC shoot with elite_e46 for @pbmwmagazine // #meatyflush #bmw #bmwusa… https://t.co/7w1x7293Sf",  "source": "<a href="http://instagram.com" rel="nofollow">Instagram</a>",  "truncated": false,  "in_reply_to_status_id": null,  "in_reply_to_status_id_str": null,  "in_reply_to_user_id": null,  "in_reply_to_user_id_str": null,  "in_reply_to_screen_name": null,  "user": {  "id": 2418867279,  "id_str": "2418867279",  "name": "MeatyFlush",  "screen_name": "MeatyFlush",  "location": "DMV",  "description": "We are a photography collective who's sole purpose is to capture the car scene featuring some of the hottest and greatest automobiles in the DMV.",  "url": "http://t.co/lk0XSiqxNR",  "entities": {  "url": {  "urls": [   {  "url": "http://t.co/lk0XSiqxNR",  "expanded_url": "http://meatyflush.com",  "display_url": "meatyflush.com",  "indices": [  0,  22   ]   }   ]   },  "description": {  "urls": []   }   },  "protected": false,  "followers_count": 57,  "friends_count": 14,  "listed_count": 14,  "created_at": "Tue Mar 18 01:02:57 +0000 2014",  "favourites_count": 19,  "utc_offset": null,  "time_zone": null,  "geo_enabled": false,  "verified": false,  "statuses_count": 1552,  "lang": "en",  "contributors_enabled": false,  "is_translator": false,  "is_translation_enabled": false,  "profile_background_color": "C0DEED",  "profile_background_image_url": "http://abs.twimg.com/images/themes/theme1/bg.png",  "profile_background_image_url_https": "https://abs.twimg.com/images/themes/theme1/bg.png",  "profile_background_tile": false,  "profile_image_url": "http://pbs.twimg.com/profile_images/519541485772214272/aKA7PMCO_normal.jpeg",  "profile_image_url_https": "https://pbs.twimg.com/profile_images/519541485772214272/aKA7PMCO_normal.jpeg",  "profile_banner_url": "https://pbs.twimg.com/profile_banners/2418867279/1395120295",  "profile_link_color": "0084B4",  "profile_sidebar_border_color": "C0DEED",  "profile_sidebar_fill_color": "DDEEF6",  "profile_text_color": "333333",  "profile_use_background_image": true,  "has_extended_profile": false,  "default_profile": true,  "default_profile_image": false,  "following": false,  "follow_request_sent": false,  "notifications": false   },  "geo": null,  "coordinates": null,  "place": null,  "contributors": null,  "is_quote_status": false,  "retweet_count": 0,  "favorite_count": 0,  "entities": {  "hashtags": [   {  "text": "meatyflush",  "indices": [  70,  81   ]   },   {  "text": "bmw",  "indices": [  82,  86   ]   },   {  "text": "bmwusa",  "indices": [  87,  94   ]   }   ],  "symbols": [],  "user_mentions": [   {  "screen_name": "PBMWmagazine",  "name": "PBMW",  "id": 169887768,  "id_str": "169887768",  "indices": [  53,  66   ]   }   ],  "urls": [   {  "url": "https://t.co/7w1x7293Sf",  "expanded_url": "https://www.instagram.com/p/BAzhLBmPkIE/",  "display_url": "instagram.com/p/BAzhLBmPkIE/",  "indices": [  96,  119   ]   }   ]   },  "favorited": false,  "retweeted": false,  "possibly_sensitive": false,  "lang": "en"   }

 

The output looks like this:

 

[Screenshot: Clipboard01.jpg]

 

Enjoy!!

 

bye


