Channel: SCN : Blog List - All Communities

Creating a Mobile App in Hana Cloud Platform Mobile Services (HCPms)


Primary Author

Jack Ross, Software Engineer, WillowTree, Inc.®

 

Contributing Author

Andrew Carter, Senior Software Engineer, WillowTree, Inc.®


Hana Cloud Platform mobile services (HCPms) is a powerful new tool released by SAP that allows SAP components to be quickly extended to, and managed on, mobile devices. HCPms offers many tools to ease the development process, including:

 

  • Multiple authentication methods
  • Secure access to on-premise and cloud-based systems
  • One-to-One and One-to-Many Push Notifications
  • Data synchronization
  • Remote logging

 

The steps to create an app in HCPms take about 15 minutes. To get started, you need to create a Hana Cloud trial account. Once you have registered and logged in, you will be directed to the Hana Cloud Platform Cockpit. Select Services in the left-hand panel, then navigate to Mobile Services.

sap-hana_blog-post-image1_JR.png

Mobile services are disabled by default. To enable them, select the Mobile Services tile and click Enable.

sap-hana_blog-post-image2_JR.png

The service will take a few minutes to start up. If you see a “failure starting service” warning, simply wait a few minutes and click Enable again. Once the status changes to Enabled, select Go To Service. This will take you to the Hana Cloud Platform Mobile Services Cockpit.

sap-hana_blog-post-image3_JR.png

In the Mobile Services Cockpit, click the Create Application button and fill in the form information. For iOS apps, select Native type. If you want to require user sign-on to access the SAP data, select the appropriate Security Configuration (see picture below).

sap-hana_blog-post-image4_JR.png

Save the application and you are ready to create your first HANA Cloud mobile application. For authentication and access, you will need the Application ID and authentication data (account and password). Trial accounts have only one user account, and its credentials are the same as those used to log in to HANA Cloud.
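As a rough sketch of what that first authenticated call can look like from a client, the snippet below prepares a device-registration (onboarding) request against the application's Connections endpoint using HTTP basic authentication. The account name, application ID, and host are placeholders, and the endpoint path should be verified against the HCPms documentation for your landscape:

```python
import base64
import json
import urllib.request

# Placeholder values -- substitute your own trial account, Application ID,
# and HANA Cloud credentials (the trial has a single user account).
ACCOUNT = "p1234567trial"
APP_ID = "com.example.demo"
HOST = f"https://hcpms-{ACCOUNT}.hanatrial.ondemand.com"

def connection_url(host: str, app_id: str) -> str:
    """Build the onboarding (device registration) endpoint for an application."""
    return f"{host}/odata/applications/latest/{app_id}/Connections"

def register_device(user: str, password: str) -> urllib.request.Request:
    """Prepare a registration request authenticated with HTTP basic auth."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    body = json.dumps({"DeviceType": "iPhone"}).encode()
    return urllib.request.Request(
        connection_url(HOST, APP_ID),
        data=body,
        headers={
            "Authorization": f"Basic {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Passing the prepared request to `urllib.request.urlopen` performs the call; the connection ID returned by the server is then sent with subsequent OData requests.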

 

To start authenticating from a mobile client be sure to read the tutorial we'll be publishing tomorrow, SAP Mobile Application Tips: SCM Authentication Without MAD.


Sappi (paper) improves Production Costs and Profitability with Real-Time Visibility


First published in SAP Insider

 

Sappi Limited has grown to become a leading global producer and supplier of dissolving wood pulp, paper pulp, and paper products — with an expanded system landscape to match. Its Southern Africa division recognized an opportunity to improve efficiency by consolidating the data spread across its multiple manufacturing systems. Learn how Sappi Southern Africa used SAP Manufacturing Integration and Intelligence to gain increased visibility into its production costs by integrating its manufacturing data.


While going digital is all the rage these days, the paper and pulp industry remains one of the largest industrial sectors in the world. Despite the shift toward paperless communication, demand is on the rise in other areas, such as packaging materials driven by the continued surge in online shopping. At the same time, the industry is more competitive than ever due to rising input costs, challenges in obtaining raw materials, and competition from global players.

Headquartered in Johannesburg, South Africa, Sappi Limited is a leading global producer and supplier of dissolving wood pulp, paper pulp, and paper products. Sappi has been in business for nearly 80 years, and has grown to produce roughly 5.7 million tons of paper, 2.4 million tons of paper pulp, and 1.3 million tons of dissolving wood pulp each year at manufacturing sites across three continents.

As the company has grown, so has the system landscape supporting its manufacturing operations. Recognizing an opportunity to improve efficiency and gain increased visibility into its production costs, in 2006, the company’s Southern Africa division sought to systematically bring together the data residing in its numerous manufacturing systems to enable more meaningful reporting.

Sappi Limited case study

 

A Need for Integration


At the time that Sappi Southern Africa began looking for a solution to help integrate its systems and improve reporting, it had eight paper, pulp, and paper packaging mills (Enstra, Ngodwana, Cape Kraft, Tugela, Saiccor, Stanger, Adamas, and Usutu, excluding the Lomati sawmill) in production and running on systems that ranged from legacy and in-house developed applications to best-of-breed systems, with new systems undergoing development.

“We had quite a patchwork quilt of systems and data silos, with each system providing its own set of reports and controls,” says Sappi Southern Africa’s Business Process Engineer Joanne Boyd, who led the integration project team. “What we were lacking was the consolidation of all that data into meaningful information.”

The data silos caused mill employees to waste a lot of time and productivity while gathering and formatting all the data from the different systems into spreadsheets. “While we had basic operational reporting in place, the manual consolidation of the data meant that there wasn’t a whole lot of time left for the business users to analyze the information,” Boyd explains. “Also, we had multiple interpretations of the same data because everybody was doing their own spreadsheets. We had conflicting key performance indicator (KPI) values floating around the organization.”

From a technical aspect, the data silos produced complex webs of interfacing and integration between the systems, and simplification was necessary to keep operations running smoothly and provide meaningful information to the business. It was at this time that Sappi Southern Africa purchased the SAP Manufacturing Integration and Intelligence (SAP MII) application — a solution that integrates plant and enterprise systems to enable unified visibility into manufacturing processes and data — and embarked on a two-step integration project.

 

 


 

Step 1: Integrate the Manufacturing Systems and Consolidate Data


The first step was using SAP MII to integrate and consolidate the data from Sappi Southern Africa's various manufacturing execution systems (MES). This horizontal integration consolidated all the manufacturing data from the various systems in the MES arena into a single, unified view within SAP MII, beginning with historical information (data from previous shifts, previous days, month-to-date figures, and so on) before moving on to real-time visibility in the second step of the project.

Sappi Southern Africa saw immediate results. “We instantly made information a lot more accessible and digestible,” says Boyd. “Finally, we got to the point where we were able to develop some very basic predictive reporting,” she adds. Still in its infancy at the time, the SAP MII–based reporting capability allowed operators to see how much they were producing during a shift, at what cost, and determine whether they were on track to meet daily targets.

After implementing the horizontal integration at the first mill, Sappi Southern Africa followed the same recipe while rolling out SAP MII at each remaining mill. And since the project team developed a generic reporting toolset — rather than developing reports specific to a particular mill — the team could drop these toolsets into any mill with minimal configuration, enabling rapid deployment. Operators could then quickly and easily gain insight into information that included downtime, production, and quality for the day.

“The user-adoption rate was great,” Boyd says. “All of a sudden, users had information available at their fingertips. While we started by implementing basic production dashboarding and reporting, this soon gained momentum and more requests started to come through from our users.”

Once the users realized what the system could do for them, their requests started to get a little more sophisticated. To take reporting to the next level for users, Boyd and her team worked with the business to learn its biggest pain points and then, a few years later, embarked on the project’s second step: connecting plant floor systems with financial data from the ERP system. This vertical system integration project started with the development of a real-time solution that enabled visibility into the measurement and control of consumptions and costs.

Step 2: Develop a Real-Time Costing Solution


The Sappi real-time costing solution enabled the integration of the actual consumptions of the various bills of material (BOMs) with the budgeted consumptions. It also provided a cost comparison between budget and actuals. This project has enabled operators and management alike to understand the impact of production problems and decisions, not only from a budget/recipe perspective but particularly from a financial perspective, and has assisted with variable cost control within the organization.

Operators are able to identify, in real time, the minute a particular BOM is overconsumed and, more importantly, understand the financial impact of that consumption. This information is then gathered over time, enabling Sappi Southern Africa to determine the cost of each roll of paper produced. In addition, it provides the opportunity to analyze the profitability of various products, performance against budget, recipe management, and even which crews or shifts perform more effectively.

To enable access to this consumption information across systems, Boyd and her team developed a solution in SAP MII that was easily deployed at any mill with very little customization. “The solution is able to integrate ERP information and plant floor information, regardless of what plant floor systems are in place,” she says. “The mill queries are the only part of the solution requiring customization. By pulling the production information, the consumption information, and the costs together, we can provide feedback to users on the physical BOM quantities consumed against budget, and more importantly, we provide detailed cost comparisons against budget right down to each roll of paper produced. This financial perspective enables us to pinpoint opportunities to save money.”
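The core of that comparison can be sketched in a few lines of Python; the component names, quantities, and unit costs below are invented purely for illustration:

```python
# Invented figures for illustration; in the real solution the actual
# consumptions come from the plant floor and the costs from the ERP system.
budget = {"pulp_t": 0.85, "starch_kg": 12.0, "steam_t": 2.1}    # per roll
actual = {"pulp_t": 0.88, "starch_kg": 11.5, "steam_t": 2.3}
unit_cost = {"pulp_t": 410.0, "starch_kg": 0.9, "steam_t": 25.0}

def cost_variance_per_roll(budget, actual, unit_cost):
    """Return the per-component cost variance and the total for one roll.

    A positive value means the component was over-consumed against budget.
    """
    variance = {
        comp: (actual[comp] - budget[comp]) * unit_cost[comp]
        for comp in budget
    }
    return variance, sum(variance.values())

variance, total = cost_variance_per_roll(budget, actual, unit_cost)
```

Summing such variances over time is what makes it possible to state the cost of each roll produced and to compare products, crews, or shifts against one another.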

With this integration and consolidation of information, Sappi Southern Africa is able to monitor production costs more effectively. “This visibility has promoted better cost control,” Boyd adds. “Generally, production managers know what they are doing and rely on experience to understand what affects the profitability of the various grades. Now, we are backing this up with real, tangible data and making it more visible to every level of the business.”

On the implementation side, it was smooth sailing for Boyd’s team. “We didn’t have any technical implementation challenges,” she says. “From a technical perspective, we had a strong development team that was proficient in SAP MII. A lot of the intellectual property from the various systems was within our domain, which was a big differentiator.”

The third and final step of the integration project will be to implement enterprise-level reporting. Sappi Southern Africa is ready this year to begin that stage — it is currently drawing up a proof of concept to upgrade from SAP MII 12.2 to SAP MII 15.0. “We are currently reviewing our enterprise reporting strategy,” says Boyd. “And while our SAP MII 15.0 upgrade is imminent, we are presently ironing out the details.”

Is S/4HANA a real innovation or simply SAP's marketing blurb?


Before the era of computers, we had documents: sales orders, purchase orders, accounting documents. Every business process is represented by a document. With the advent of computers and databases, each document type came to be represented by a core table holding its line items. Depending on the industry, these tables had roughly 200 to 400 fields.

 

Whenever one of these documents changed, the whole record was copied to create a new record. And when a document went through a series of changes, it was necessary to keep track of them all in a change-log table, recording who changed what for audit purposes.

 

Data modelers then had the idea of splitting these tables into smaller ones, such as header, line item, and schedule line, which evolved into the concept of normalization. The resulting tables have fewer fields, but the principle is the same: any change copies the whole record, and each table maintains its own change log. As a result, a single business process is spread across 20 to 30 smaller tables joined into a cluster. This is still how traditional databases support the process today.

 

On top of this sit database constraints, such as many-to-one relationships and join dependencies, that DB administrators must keep intact, along with locking, latching, deadlock, and paging issues that create plenty of performance problems. The result is twofold: massive table-space usage, and slow reads across all these tables.

When the database slows down, we start adding indexes, which are effectively copies of the tables offering a narrow access path, and each table may carry not one but many of them. So, because the system is getting slower, we copy the tables yet again, consuming still more space. But indexes alone are not fast enough either, so we start creating aggregates: whole new tables, built and maintained by SAP code, that hold things like daily, weekly, or monthly invoice totals. Managing that code and its customer exits adds yet more complexity, all in the name of better reporting and a faster system.

Those are just the standard indexes and aggregates. Experienced DBAs also analyze customer-specific queries and build additional custom indexes and aggregates. During upgrades, service pack updates, and new releases, on top of our own work, the system has to roll the data up into all of these standard and custom tables, indexes, and aggregates so that document changes remain reportable.


Consider building dashboard reporting: as a developer, you need to know which business process changes affect which tables, what they trigger, and how the changes flow downstream. That is why BW became significant. SAP integrated this flow and built module- or process-specific extractors, enabling a single source of truth in BW. The BW extractors took care of this spider web of tables and simplified the logic into something that makes sense to consumers.

 

This was the only way to keep the system running well, and traditional databases were the best available technology for building and sustaining this model. Now let us look at what is different in S/4HANA, and why.

 

The alternative is to think about getting away from this complexity altogether.

 

With Suite on HANA, SAP took several steps: it was able to drop the need for some of the indexes and aggregates, but the fundamental document model, and the tables and relations required to maintain it, was left unchanged to preserve compatibility for customers.

 

S/4HANA, by contrast, is a complete rethink that leverages the capabilities of HANA, simplifying the model for customers and making it much more agile for development staff. The document concept is still the same, and we still have the same record, but it is no longer a row table: each field is now represented as its own columnar store. The big difference is that when a value changes, we insert just that one field value instead of replicating the whole record with the changed values. Doing this for a 400-field table would be practically impossible in a relational database with so many table rows and indexes. We still have the same number of field changes, but each field has its own columnar storage, which means every field can act as an individual index.

 

We are down to one table from the spider web of 20 to 30 tables per business process. All the locking and latching complexity described above is no longer an issue; the only locking concern left is concurrent inserts into the same columnar table. We no longer need to create whole new aggregate tables for a new requirement: we can run against the same table, and HANA aggregates on the fly. There is no need to create indexes to speed things up, and no need for separate status or header tables, so all those nested tables collapse into a single line-item document table. As I understand it, all the SAP code for maintaining the old table cluster is gone as well. Now think about the testing and regression effort: it is radically simplified, which means a higher-quality process. Compared with everything that can go wrong in a traditional database, coding against this simpler S/4 structure leaves far less room for mistakes.
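The insert-only idea can be made concrete with a toy model. This is a deliberate simplification, not HANA's actual implementation: each field is an append-only column, a change touches only that one column, and an aggregate is computed by scanning a column on the fly rather than read from a precomputed table:

```python
class ColumnStore:
    """Toy insert-only column store; each field is its own append-only column."""

    def __init__(self, fields):
        self.columns = {f: [] for f in fields}   # field -> list of (doc_id, value)
        self.latest = {}                         # (doc_id, field) -> index of current value

    def insert(self, doc_id, record):
        for field, value in record.items():
            self.columns[field].append((doc_id, value))
            self.latest[(doc_id, field)] = len(self.columns[field]) - 1

    def update(self, doc_id, field, value):
        # Only the changed field gets a new entry; a 400-field record is NOT copied.
        self.columns[field].append((doc_id, value))
        self.latest[(doc_id, field)] = len(self.columns[field]) - 1

    def read(self, doc_id, field):
        return self.columns[field][self.latest[(doc_id, field)]][1]

    def total(self, field):
        # Aggregate "on the fly" over current versions -- no aggregate table needed.
        return sum(self.columns[field][i][1]
                   for (doc, f), i in self.latest.items() if f == field)
```

Because every column can be scanned independently, each field behaves like its own index, which is the property the paragraph above describes.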

 

As you can see, S/4HANA is a huge innovation: it leverages what was not possible before HANA, namely a radically simpler data model and the ability to aggregate on the fly, with a new code line written against that simpler structure.

 

Last but not least, traditional databases with this complex structure are not compatible with a multi-tenant cloud; they have to run as a hosted service or on premise. SAP addressed this while moving from the clustered table structure to the new S/4HANA code line: it was designed to run on a multi-tenant cloud, hosted, or on premise, letting customers move back and forth between these deployment models. None of the other ERP competitors can offer this.

 

S/4HANA enables a lot of business benefits, building on the advantages of Suite on HANA.

S/4HANA rocks!

Increasing performance of high-volume data loads via the Migration Workbench


Motivation

 

When uploading a high volume of data into C4C via the Migration Workbench (MWB), performance is always an important aspect. The MWB offers various mechanisms for high-volume data loads, the most relevant ones being:

  • Bundling: Multiple instances of an object (e.g. multiple customer records) can be uploaded with 1 webservice call
  • Parallelization: Multiple parallel webservice calls can be performed at the same time

 

While the bundling factor is hard-coded per migration object (e.g., for accounts it is 10), the parallelization factor can be set by the key user within the MWB before the actual migration run. If it is not set explicitly, the default is 4. The section below describes how to adjust this factor, which can result in a significant performance boost.

 

Configuration

 

1) After accessing the staging area of the MWB, click on "You Can Also" in the top right corner. There you will find the option "Adjust Settings for Parallel Processing"

 

MWB_Parallel1.png

 

2) In the following popup you can adjust the parallelization factor. If you open it for the first time, you will see the global default, which is 4.

 

mwb_Parallel2.png

 

Raise this factor to increase the number of webservice calls executed by the MWB in parallel at any point in time. Note that for the customer upload, internal tests have shown that a parallelization factor of 30 results in the best migration performance.
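Putting the two mechanisms together: a load of N records is cut into bundles, and at most `parallel` calls run concurrently. The sketch below is illustrative only; `upload_bundle` is a stand-in for the MWB's actual webservice call:

```python
from concurrent.futures import ThreadPoolExecutor

BUNDLE_SIZE = 10        # hard-coded per migration object, e.g. 10 for accounts
PARALLEL_FACTOR = 4     # key-user setting in the MWB; the global default is 4

def upload_bundle(bundle):
    """Stand-in for one MWB webservice call creating a bundle of records."""
    return [f"created:{rec}" for rec in bundle]

def migrate(records, bundle_size=BUNDLE_SIZE, parallel=PARALLEL_FACTOR):
    bundles = [records[i:i + bundle_size]
               for i in range(0, len(records), bundle_size)]
    results = []
    # At most `parallel` webservice calls are in flight at any point in time.
    with ThreadPoolExecutor(max_workers=parallel) as pool:
        for outcome in pool.map(upload_bundle, bundles):
            results.extend(outcome)
    return results
```

With 25 records and a bundle size of 10, three calls are made; raising `parallel` from 4 toward 30 simply lets more of those calls overlap.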

Reimagine the SAP user experience (UX) with SAP Fiori


Personalize and simplify the user experience (UX) for your SAP applications. Using modern UX design principles, SAP Fiori delivers a role-based, consumer-grade user experience across all lines of business, tasks, and devices. Use SAP Fiori apps, or take advantage of new solutions that natively incorporate the UX – such as SAP S/4HANA, Ariba Mobile, and SAP Cloud for Customer.

3 Ways to Create an Internal Business Solution with a Mobile App


The emergence of smartphones, mobile apps, and social media networks has brought about a huge revolution in the business world. The link between these three powerful technological innovations is that most smartphone owners spend a considerable amount of time using social media apps.

iphone-1032779_640.jpg

Royalty-free image

According to research on mobile behavior, consumers spend over 85% of their smartphone time using native applications, but the majority of that time (84%) is spent in just five apps they’ve installed from the app store.

A statistic on the number of apps available for download in leading app stores as of July 2015 shows that Android users were able to choose between 1.6 million available apps. Apple's app store remained the second-largest app store with 1.5 million available apps, although Apple announced last year that 100 billion apps had been downloaded from its App Store from July 2008 to June 2015.

From a business perspective, this is raw gold for developers specializing in mobile apps, because most of these apps are not free. In fact, the amount paid to Apple app developers runs into billions, outweighing Google and Windows. On a larger scale, however, businesses have even more to gain from creating customized apps.

Therefore, if you’re looking for an internal business solution that will help you achieve your business goals, developing an app on iOS, for Apple’s iPhone and iPad, is the way to go.

Here are 3 ways your mobile app can help you create internal business solutions.


1. Central management of all devices

With new enhancements to Mobile Device Management (MDM) capabilities, iOS gives your IT team more options to control devices and data from a central location. This gives you more visibility and lets you manage all devices from one place: you can apply the same settings across multiple devices, install apps on several devices at the same time, and provide remote support across the board.


2. Internal app distribution network

With iOS, corporate users are provided with distribution methods that give employees everything they need for their work. Essentially, this includes personalized emails and access to a corporate network. It also ensures that the IT department gets all the facilities required to support the entire network. In addition, custom-developed apps can be made to serve your business privately by restricting global access to the application in the app store.


3. Visibility to customers at all times

Apart from making it possible to have an internal system that only members of your organization can access, the iOS platform also allows you to maintain direct interaction with customers. You can collect feedback, answer queries, capture customer data, and use the results to provide improved services. With this vital data, you can also create targeted content like stunning infographics, engaging animated videos, and interactive SlideShare presentations.

By using the app and the data generated from it, you can provide great customer experience and help your employees stay productive.

Lastly, the combination of unique features such as enhanced management tools, integrated systems, and effective distribution networks continues to make Apple’s iOS the best platform for business, creating the much-needed solution for your internal business needs.

Digital Dose – 10000


Below is a collection of #digital tweets from the last 2 weeks – from my perspective the trends, tips and digital news that’s worth a read whether you’re a casual observer or a fellow CDO.

 

2 Mar 2016 This study by BT Mobile and Oxford University took an in-depth look into the 'dos and don'ts' of tech etiquette and came up with a definitive modern day guide. Read more here

 

2 Mar 2016 Nearly every one of the world's largest technology companies is trying to figure out how to let computers understand human speech, but a Santa Clara-based startup may have just cut its way to the top of the field. Read more here

image1.jpg

 

4 Mar 2016 Digital transformation and disruption have been making waves lately across all industries. To stay relevant, companies, departments, and individuals need to know exactly where business technology is headed. Check out where the future of digital transformation is heading

 

4 Mar 2016 Federal health officials announced a deal Monday that should make digital health records easier for consumers and regulators to access and address safety issues linked with the data. Read more here

 

5 Mar 2016 To say that digital advertising doesn’t work at all because response rates are low is erroneous. Check out more

 

image2.jpg

 

8 Mar 2016 It’s historically brought together Democrats and Republicans—but the Apple-FBI fight is starting to align along party lines. Read more here

 

11 Mar 2016 The cyber attacks of the future may be hard to spot, and nations may fight over fiber. Check out how future wars will be fought over digital resources

MRP: Simplification on S/4 HANA


SAP strategy for the digital economy is based on the concept of simplification. The SAP strategy is to accelerate business innovation through radical simplification.

 

This concept was also applied to the new SAP S/4HANA, on-premise edition 1511, where several application scenarios available in ECC were simplified. The idea is to make the S/4HANA implementation faster and easier, in order to accelerate the return on investment and also simplify the consultants' work.

 

A complete list of the simplified scenarios is available on SAP Help at the following link:

 

http://uacp.hana.ondemand.com/http.svc/rc/PRODUCTION/pdfa4322f56824ae221e10000000a4450e5/1511%20000/en-US/SIMPL_OP1511.p…

 

Among those scenarios, some are specific to Material Requirements Planning, and changes were made to functionalities that had been part of the standard system for a long time.

 

Some of the most relevant changes are the following:

 

1 - Simplified sourcing

 

The source determination carried out by MRP was completely redesigned for internal and external procurement.

For internal procurement, the BOM selection method is no longer available. The BOM is now always selected according to the production version; therefore, it is mandatory to have a production version.

For external procurement, the source list is no longer relevant to MRP; a valid info record is enough for MRP to assign a vendor to a planned order or purchase requisition.

 

2 - Storage location planning

The default option to plan a storage location separately is now the storage location MRP area. This means that the material master options to plan a storage location separately, or to exclude it from MRP, are no longer available.

With the storage location MRP area, there is a separate planning file entry, which means that the MRP area is planned independently. This is a good thing from a performance point of view, since the entire material no longer needs to be planned just because a change happened in a storage location planned separately.

Besides that, a storage location planned separately only runs a very simple reorder point MRP type, while the MRP area offers almost all the settings available in the material master.
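The performance point can be illustrated with a tiny net-change model (the names below are invented; the real net-change flags live in the planning file): a change flags only the affected material/MRP-area combination, so the MRP run replans just that entry rather than the entire material:

```python
planning_file = {}   # (material, mrp_area) -> net-change flag

def post_change(material, mrp_area):
    """A goods movement or demand change flags only the affected MRP area."""
    planning_file[(material, mrp_area)] = True

def run_mrp():
    """Plan every flagged entry and reset its net-change flag."""
    planned = [key for key, flagged in planning_file.items() if flagged]
    for key in planned:
        planning_file[key] = False
    return planned

post_change("PUMP-100", "1000")        # plant-level MRP area
post_change("PUMP-100", "1000/0002")   # storage location MRP area
```

After a run, a later change in only the storage location MRP area would flag, and hence replan, just that one entry.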

 

3 - Subcontracting

Planning of components provided to the vendor must now be carried out with MRP areas. A separate stock segment for the vendor is not available anymore, mainly for performance reasons. It is also easier now to use subcontracting MRP areas, since we simply need to create them in customizing and they become available for all materials.

 

4 - Planning file

The old planning file entry tables were replaced by the new table PPH_DBVM, with a simplified update logic, better consistency, and improved update performance.

There are also new reports and transactions to create and check the consistency of the planning file.

 

 


9781493213993_267_2d.png

If you want to learn more about MRP Live, Fiori Apps and MRP on S/4 HANA, my new e-book will be published in April.

 

See more details on Running MRP on SAP HANA (SAP PRESS).


SAP Integrated Business Planning – Holistic Planning in Short Order


SAP Integrated Business Planning (IBP) enables integrated planning and end-to-end visibility into the supply chain, with real-time monitoring and analytics. As of mid-March 2016, the SAP HANA Integrated Business Planning rapid-deployment solution V2.61 is available as a packaged offering: it connects all planning applications of SAP IBP into one integrated, end-to-end business planning process by leveraging the new unified planning area, and provides predefined content and services to kick-start every implementation.

 

Integrated End-to-End Business Planning Process

 

SAP HANA Integrated Business Planning rapid-deployment solution supports all your business planning on strategic, tactical, and operational levels in an integrated fashion. This is outlined in the following figure.

 

IBP_FLOW.png

 

Figure 1: End-to-end business planning – integrated, aligned, holistic

 

On a strategic level, you align demand and supply review planning activities with sales and operations planning activities. Input for these activities is taken from external planning tools, e.g., the financial, marketing and sales plan, as well as from other IBP planning applications like the global demand plan and safety stock information. The output of this planning cycle is an executive approved final consensus demand. This plan is taken forward to the operational planning level: Here the global and local demand plans, the safety stock and the sensed demand are adjusted during the respective planning activities. The planning processes can run on a monthly, weekly, or daily basis.

Note that selected planning figures, such as the final consensus demand, global demand, safety stock, and sensed demand can be exported to operational planning and execution systems for further processing (Figure 1, dotted line).

The Supply Chain Control Tower accompanies all these planning activities by providing visibility along all planning phases, with real-time monitoring, analytics, alerting, and exception management (Figure 1, box at top).

 

Processes Covered With This Rapid-Deployment Solution

 

To facilitate the implementation of the above-mentioned SAP IBP scenario, this rapid-deployment solution offers various pre-configured scope items, or processes:

 

  • For sales and operations
  • For supply planning
  • For demand planning
  • For demand sensing
  • For inventory, and
  • For the Supply Chain Control Tower

 

The scope items are designed in a way that they can be implemented standalone or in combination with other scope items. This gives you the flexibility to pick and choose what you need to support your current planning activities. Once extended to all scope items, you can fully explore the power of integration because all scope items work on the same set of planning data – the unified planning area.

 

End-to-end Planning Experience Based on a Unified Planning Area

 

All processes shipped with this rapid-deployment solution centrally build on one unified planning area, which is an integral part of the underlying SAP IBP application and the backbone of the planning process. The planning area gives you the flexibility to accommodate your planning activities at different time and data granularities: whereas you may want to perform your sales and operations planning activities on a monthly basis for a combination of product family, customer, and region, your weekly demand planning activities may require a more granular level of planning for individual products per customer, and in demand sensing you typically schedule your planning activities on a daily basis. With the unified planning area in SAP IBP, you get a set of reference attributes, master data, time profiles, planning levels, key figures, and versions preconfigured to cover the above-mentioned use cases, giving you a holistic planning experience.
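As an illustration of one key figure serving several time granularities, the sketch below rolls an invented daily demand series up to weekly and monthly buckets; in SAP IBP this rollup is configured through the planning area's time profiles rather than hand-coded:

```python
from collections import defaultdict
from datetime import date, timedelta

# Invented sample data: demand kept at the most granular (daily) level,
# as used in demand sensing.
daily_demand = {date(2016, 3, 1) + timedelta(days=i): 10 + i for i in range(14)}

def aggregate(demand, bucket_of):
    """Roll a daily key figure up to a coarser planning level."""
    buckets = defaultdict(int)
    for day, qty in demand.items():
        buckets[bucket_of(day)] += qty
    return dict(buckets)

weekly = aggregate(daily_demand, lambda d: d.isocalendar()[1])    # ISO week
monthly = aggregate(daily_demand, lambda d: (d.year, d.month))    # month
```

Because every level is derived from the same base data, monthly S&OP figures, weekly demand plans, and daily sensed demand stay consistent by construction.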

 

IBP_Figure_2.png

 

Figure 2: SAP IBP application areas, unified planning area and rapid-deployment solution – a solid framework to implement and operate SAP Integrated Business Planning

 

Implementation Content Delivered With This Rapid-Deployment Solution

 

This solution comes with configuration guides and test scripts for the above-mentioned scope items. Additionally, you may accelerate and facilitate your planning activities with the following content:

 

  • Planning Views – Planning templates that can be launched from the SAP IBP Microsoft Excel add-on, containing one or several worksheets for easy-to-use planning. You may view and adjust planning data, and even simulate planning results.
  • Dashboards – Web UI-based graphical representations of planning data, such as dashboards for the demand review featuring the consensus demand based on sales forecast and annual operating plan.
  • Process Management – Templates with status control guiding you through your planning activities in SAP IBP. Here you may indicate the progress of your individual planning activities, such as the completion rate for your reviews and meetings. Process Management requires integration into SAP JAM.
  • SAP JAM – Integration into a collaborative online platform that allows you to share planning updates with all stakeholders involved in the planning activities.
  • Sample Data – CSV files that populate your SAP IBP system with sample master and transactional data, helping you run a sample planning scenario.
  • Alerts – Alerts are triggered based on predefined key figures and alert rules shipped with this solution. They may, for example, indicate capacity overload on your production facilities. Alerts are shown in your Microsoft Excel planning views or as custom alerts in the Alert Monitor, used for exception-based interaction.

 

IBP_Figure_3.png

 

Figure 3: Sample dashboard for S&OP supporting the demand review

 

The deployment of the implementation content comes with a rapid-deployment solution service. It guides you through all major project implementation activities, including a kick-off workshop and installation check, the implementation and activation of the solution, process testing, knowledge-transfer workshops for key users, as well as support for going live and post-go-live.

 

More information


  • 3 minute video: SAP Integrated Business Planning powered by SAP HANA – Solution Overview
  • About this rapid-deployment solution on SAP Service Marketplace
  • Offline demo showing the end-to-end integrated planning experience, which can be built and test-run with this rapid-deployment solution (requires logon to SAP Demo Store)

Faxing across the digital divide


278045_l_srgb_s_gl.jpg

How many millennials does it take to send a fax? Several, apparently.


Recently the Institute was asked to send an important document via fax. “Fax,” we asked, “aren’t we meant to be a digital institute?” The job was tasked to some young team members, and after they asked older staff around the office for advice on how to send a fax, it was soon discovered that a fax had never been sent from the office before, and the machine did not even work. So, a trip to the local post office and $12 later, the fax was off and there was an audible sigh of relief.

 

This story is a little humorous and makes a fool of millennials, sure, but is there some kind of lesson to be learned from it?

 

In many countries faxes will soon be a thing of the past, as sending documents via email has become simpler and more accessible. In other countries, however, faxes are still, and will continue to be, used extensively. In Japan, for example, where fax machines are a common communication tool, nearly 100 percent of companies and 60 percent of private homes have one.

 

The SAP Institute for Digital Government (SIDG) is based in Australia, a highly digital society, but we cannot forget that not every country is moving at the same pace as us, and there are times when we need to adjust. The recent fax machine example demonstrates this.

 

The digital divide is very real and, like most complex problems, will likely always exist despite best efforts to address it. Making uneducated assumptions about the digital uptake in particular countries, or in population cohorts within a country, will almost certainly end in failure. This is because a country’s digital adoption rate depends on a range of factors and therefore doesn’t solely correlate with the country’s fiscal status. The Philippines is an interesting example of this: although around 25% of the country lives in poverty, its digital adoption rate is very high. In fact, it is often referred to as the “texting capital of the world.” There are 114.6 million mobile phones for a population of just 100.8 million people!

 

With that in mind, how should we approach the digital divide? If we expect everyone to have the latest technology and only communicate through these means, we run the risk of widening the gap even further. No matter how advanced our technology becomes, it’s redundant if it does not serve its purpose and communicate at our audience’s pace. Sometimes we need to adjust, take the “outdated” route, and have a quick laugh to ourselves when we realise we’re in fact the ones who struggle to use the technology.

 

When sending the fax, we were dealing with a foreign country where fax is quite a common means of business communication. However, the digital divide does not only exist between countries, but also within countries, across different cohorts of citizens. This is a significant issue for governments that are promoting a digital-first policy but are held back by citizens who aren’t quite ready for it. Rather than being held back by the minority and designing systems that cater for absolutely everyone, governments should design for the majority to take advantage of the digital economy and manage the exceptions, with the view of bringing everyone up to speed when people are capable and ready.

 

A key aim of the SIDG is creating public value for government and its citizens through digital capability insights. If we do not understand that sometimes creating public value means taking a step back from our advanced digital world and looking at the situation through the consumers’ eyes, we have failed.

 

To find out more about the SAP Institute for Digital Government visit http://discover.sap.com/sap-institute-for-digital-government, follow us on Twitter @sapsidg and email us at digitalgovernment@sap.com.


Sharks Foundation Field Trip Days at The Tech Museum in San Jose


Thanks to @SAPsv for the opportunity to volunteer and participate in the Sharks Foundation Field Trip Days at The Tech Museum in San Jose on Friday, March 4.  It was a privilege to be part of a program that provides low-income students and the community access to sports-themed Science, Technology, Engineering and Math design challenge learning activities.  They were able to learn about cyber security in Cyber Detectives, wearable technology in Body Metrics, the engineering design process in Innovations in Health Care and how to creatively solve problems using technology in The Tech Studio. 


It was fun to work with the kids to build structures with poles and rubber bands, build things that fly, and be fascinated by the concept of cyber security and how to prevent cyber attacks.  Not to mention being able to sport my new SAP hockey jersey!


This program supports SAP’s commitment to involving young students in STEM education.  It is incredible that @SAPsv brings employees opportunities like this to engage with local students and show them how fun it is to learn about math and science. Plus, seeing some of the top San Jose Sharks players such as Paul Martin and Chris Tierney was a special treat.


Great fun to be had by all at The Tech Museum.  Certainly priming the pump for the young students to get excited about math and science to put them on a path to a successful career ahead! 


It’s awesome that @SAPsv gives us opportunities like this to engage with local sports teams and share our passions with other @SAPsv employees!


Pamela Dunn

Slacking Off (1 of 3)


(This is part 1 of a 3 part series. See Part 2 and Part 3 for the whole story.)

 

Did you ever have a late night instant message conversation that went something like this:

 

Screen Shot 2016-03-18 at 12.55.42 PM.png

 

It’s no fun to be in that conversation. You know you’re stuck sitting in front of a screen for at least the next 10 minutes. And since it’s your work laptop, you know that the corporate internet police won’t even let you browse reddit for cat pictures while you wait for VPN and SAP GUI to load up. What’s more, you know that whatever this person is yelling about is probably not your fault.

 

I’ve been there, trust me.

 

What if your conversation could look like this, instead:

 

Screen Shot 2016-03-18 at 12.59.54 PM.png

 

Did you notice Commander Data interject in that exchange? More on that later.

 

As nerds, our jobs often involve performing routine technical tasks for people who use our systems. Maybe you reset a lot of passwords, check the status of integrations, or respond to a support inbox. You probably have loads of different communication tools at your disposal. Chat, email, carrier pigeons…whatever gets the job done. If someone needs your help they’ll generally find a way to get in front of you. Digitally or otherwise.

 

One of the coolest communication tools I’ve worked with in the last couple years is Slack. It’s got individual conversations, group chats, categories, and anything you’d expect from a team chat tool. It’s quickly overtaken email as my preferred method of talking with colleagues.

 

Except it’s way more than chat. Slack allows you to use pre-built integrations to look at your Jira tasks, GitHub commits, and thousands of other things. What’s even better: you can make your own integrations that interact with its web API. Which makes it the perfect thing to plug into your SAP Gateway to make use of the REST APIs you’ve already created for other purposes.

 

In my next couple posts, I’ll show you how to make exactly what I did above using (nearly) free tools.

 

Slack Setup

If you're not using Slack already, you can get a free account. It's very simple and straightforward. Once you've got an account, follow these steps to set up the Slack end of this chain:

 

  • I set this up as a Slash Command. That's where the "/ask-sap" piece comes from in my chat transcript above. Go here to set a new one up for yourself.
  • On the next screen, choose the command that you want to use, starting with a '/' character. You can use /ask-sap if you want to stay perfectly within the tutorial, since these custom commands are for your own Slack team only.
  • Click the "Add integration" button.
  • Screen Shot 2016-03-18 at 1.36.36 AM.png
  • On the next page, pay close attention to the "Outgoing Data" and the "Token" sections. You may want to copy or screenshot them for reference later.
  • The only thing that absolutely has to be filled-in is the URL field. This has to be an https destination that you own or have control of, and it has to be an endpoint programmable by you. Post 2 of 3 in this series will show you how to set that up in Google App Engine - but you could realistically do it anywhere you have the rights to set up programs to run on a web server.
  • Click "Add integration" at the bottom of the page, filling out whatever else you want to along the way. I suggest at least showing your command in the autocomplete list.

 

What you just did sets things up so that Slack will respond to any message that starts with "/ask-sap" by sending an HTTP POST to the URL you provided in the settings. The format of the POST will look like the "Outgoing Data" section that you saw in the setup process. For this demo, the most important pieces are the token and text fields.
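As a sketch of how that incoming request could be decoded, here is a tiny Python example. The token value and command text are made up for illustration; the authoritative field list is whatever Slack showed you in the "Outgoing Data" section during setup.

```python
from urllib.parse import parse_qs

# Hypothetical form-encoded body, shaped like Slack's "Outgoing Data"
# example. The token value here is invented.
body = "token=abc123&command=%2Fask-sap&text=shout+out+a+test+of+RFC+MY_DEST"
fields = parse_qs(body)

token = fields["token"][0]  # compare against your integration's token
text = fields["text"][0]    # everything the user typed after the command
```

Your endpoint compares the token against the one from the setup screen (rejecting mismatches) and then parses the text to decide what to do.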

 

That's it! You now have a Slash Command available in any of your Slack channels. It won't do anything yet, but that's what we'll set up in the next section.

 

On to Part 2!

Slacking Off (2 of 3)


(This is Part 2 of a 3 part series. See Part 1 and Part 3 for the whole story.)


In Part 1, we got Slack up and running with a Slash Command that will send an HTTP POST to a specified URL endpoint. In this part, I'll show you how to set up a basic Google App Engine web server in Python to respond to this HTTP POST and format a request for SAP Gateway. From Gateway, we'll output the data that the request asks for and send it back to Slack. I won't cover all the features of App Engine exhaustively - this is an SAP blog, after all - but I'll provide sample code, links to how-tos, and some tricks I learned along the way. The amazing thing is that a super basic implementation is only about 40 lines of Python code!

 

 

Setup

  • You'll need a Google account (if you have a Gmail address, you're good to go). I like using an IDE like Eclipse with PyDev installed, but if you are a complete notepad.exe ninja then go for it.
  • You'll need to secure a domain name for yourself, or have rights to one. Google, again, has an easy way to do this.
  • You'll also need to get SSL set up for that domain, which you can do for 90 days free at Comodo.
  • Once you have the cert, you can apply it to your Google Domain like this.

 

Now you're ready to code! The easiest way to set up a project for App Engine is to do the play-at-home 5-minute version. This will get you a project set up, the right deployment tools installed, and a project folder ready to go. Try it out and test it a few times.

 

Once you're comfortable with how that works, you can simply replace the code files with the code I'll provide below. Note that there are several places in the code where I've put angle brackets with comments - this is where you'll need to fill in your own solution details. My meager programmer salary won't cover a giant hosting bill if everyone copies my domain/settings and sends all their messages through my server.

 

First, replace the contents of your app.yaml file with this code:

 

application: <your-google-app-id>
version: 1
runtime: python27
api_version: 1
threadsafe: true

handlers:
- url: /.*
  script: main.app

 

 

Very straightforward, not much to comment on here. Just remember to replace the app-id section at the top.

 

Next, create a file called main.py (or replace the contents of the existing one) with this code:

 

import webapp2
import json
from google.appengine.api import urlfetch


class SlackDemo(webapp2.RequestHandler):
    def post(self):
        sap_url = '<your-sap-gateway>/ZSLACK_DEMO_SRV/RfcDestinationSet'
        json_suffix = '?$format=json'
        authorization = 'Basic <your-basic-credentials>'
        slack_token = '<your-slack-token>'
        request_token = self.request.get('token')

        # Reject requests that don't carry our Slack integration's token
        if slack_token != request_token:
            self.response.headers['Content-Type'] = 'text/plain'
            self.response.write('Invalid token.')
            return

        text = self.request.get('text')
        details = {}
        response_text = ''  # initialized here so the += below always works

        if text.find('shout') > -1:
            details['response_type'] = 'in_channel'

        if text.find('test') > -1:
            rfc_destination = text.split()[-1]
            request_url = sap_url + "('" + rfc_destination + "')" + json_suffix
            headers = {}
            headers['Authorization'] = authorization
            response_tmp = urlfetch.fetch(url=request_url,
                                          headers=headers,
                                          method=urlfetch.GET)
            response_info = json.loads(response_tmp.content)
            response_text += 'Sensor sweep indicates the following:\n'
            response_text += response_info['d']['Destination'] + ' - '
            response_text += response_info['d']['ConnectionStatus'] + ' - '
            response_text += str(response_info['d']['ConnectionTime']) + ' ms response'
        else:
            response_text += "I'm sorry, Captain, but my neural nets can't process your command."

        details['text'] = response_text
        json_response = json.dumps(details)
        self.response.headers['Content-Type'] = 'application/json'
        self.response.write(json_response)


app = webapp2.WSGIApplication([
    ('/slackdemo', SlackDemo),
], debug=True)

 

 

I'll do a little explaining here.

  • We'll set up ZSLACK_DEMO_SRV in the next post, part 3.
  • To use Basic authentication, you'll need to take some credentials with access to your SAP Gateway and turn them into base64 encoded characters. One easy way is to bring up the Chrome javascript console (ctrl-shift-j), type "btoa('USERNAME:PASSWORD')", and take the resulting string. Obviously use a real user and password here.
  • Take the slack_token value from the screen where you set up your slack slash command in part 1.
  • The app configuration at the bottom will make it so that you should configure slack to send its commands to https://<your-domain>/slackdemo. Change that to whatever you like.
  • We treat the 'shout' text as a command to send the result of the command to the whole chat window. Otherwise the command will respond only to the person who sends the command and others won't see it.
  • We look for the word 'test' as the key to actually invoke our functionality. If we don't find that, Commander Data will respond with his polite apology.
  • We look for the name of the RFC by splitting the command up into words and then just taking the last word. Python has this nice little syntax for lists where index [-1] is the last element of the list. text.split()[-1] does this for us.

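Two of the little helpers above can be tried out quickly in plain Python, outside App Engine. The credentials and command text here are made up purely for illustration:

```python
import base64

# Base64-encode Basic auth credentials (made-up user and password).
credentials = "USERNAME:PASSWORD"
authorization = "Basic " + base64.b64encode(credentials.encode("ascii")).decode("ascii")

# Grab the RFC destination as the last word of the command text;
# [-1] is Python's shorthand for the last element of a list.
text = "shout out a test of RFC MY_DEST"
rfc_destination = text.split()[-1]
```

The resulting authorization string is what goes into the `<your-basic-credentials>` placeholder in main.py.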
 

Build the project and deploy it to the web site you're using. Now we're ready to create the Gateway service that will do the simple RFC test that Commander Data did in part 1.

 

Off to part 3!

Slacking Off (3 of 3)


(This is Part 3 of a 3 part series. See Part 1 and Part 2 for the whole story.)


In the last 2 posts we paved the way to get some data out of SAP from Slack. First, we set up Slack to send out a request when a user enters a Slash Command. Then, Google App Engine handles that request and forwards it to Gateway. Now Gateway needs to respond back to Google with the RFC connection test that the Slack user asked for.


Here's a simple OData service setup that will test an RFC connection on the ABAP system. My intention is to inspire you to do other cool solutions - I'm just setting this up to show off quick-n-dirty style to explain concepts. Take this and make something else work for you!


Go to SEGW and create a service. I called mine ZSLACK_DEMO. Here's an example setup of the fields for an entity called RfcDestination:


segw for slack service.PNG


Then code up the RFCDESTINATIONSE_GET_ENTITY method in the generated class ZCL_ZSLACK_DEMO_DPC_EXT (assuming you kept the same names I used). Make sure you generate the project first, and then do the redefinition process for the method I mentioned. Here's a great document on setting up class-based Gateway services that goes more in-depth.


Here's a simple implementation of an RFC ping method that matches up with the service we created.


   METHOD rfcdestinationse_get_entity.
     DATA: lv_start TYPE i,
           lv_end TYPE i,
           lo_ex TYPE REF TO cx_root,
           lv_rfcdest TYPE rfcdest,
           ls_key_tab LIKE LINE OF it_key_tab.

     READ TABLE it_key_tab INTO ls_key_tab WITH KEY name = 'Destination'.
     IF sy-subrc IS INITIAL.
       lv_rfcdest = ls_key_tab-value.
     ENDIF.

     er_entity-destination = lv_rfcdest.

     TRY.
       GET RUN TIME FIELD lv_start.
       CALL FUNCTION 'RFC_PING' DESTINATION lv_rfcdest
         EXCEPTIONS
           system_failure        = 1
           communication_failure = 2
           OTHERS                = 99.
       GET RUN TIME FIELD lv_end.

       IF sy-subrc IS INITIAL.
         er_entity-connection_status = 'OK'.
         er_entity-connection_time = ( lv_end - lv_start ) / 1000.
       ELSE.
         CALL FUNCTION 'TH_ERR_GET'
           IMPORTING
             error           = er_entity-connection_status.
       ENDIF.

     CATCH CX_ROOT INTO lo_ex.
       er_entity-connection_status = lo_ex->get_text( ).
     ENDTRY.
   ENDMETHOD.


Maybe not production quality, but ready to do the trick. For a good connection, it will give you an OK ConnectionStatus and a number in milliseconds for the response time. For a bad connection, it will respond with the RFC error in the ConnectionStatus field. Our Google App Engine web server receives this and plugs it into a text response to Slack. When Slack receives the response it puts the text into the chat window for the user who requested it.
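To make the round trip concrete, here is a small sketch of how the Part 2 handler turns the Gateway payload into a chat line. The destination name and timing value are invented for illustration; only the field names match the service above.

```python
import json

# Hypothetical OData response, shaped like the fields the Part 2 handler reads.
payload = '{"d": {"Destination": "MY_DEST", "ConnectionStatus": "OK", "ConnectionTime": 42}}'
info = json.loads(payload)

# Same formatting the App Engine handler sends back to Slack.
chat_line = "%s - %s - %s ms response" % (
    info["d"]["Destination"],
    info["d"]["ConnectionStatus"],
    info["d"]["ConnectionTime"],
)
```

Slack then simply prints that text into the chat window for whoever issued the command.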


Wrapping Up

Assuming all the pieces of the chain have been done correctly, you can activate your slash command. Try it with something like "/ask-sap shout out a test of RFC <your_destination>". If you're all set, the chat window will shortly return to you with a response from SAP.


This was a very simple prototype implementation - but there are so many things you could do! I'll leave you with a brain dump of ideas to inspire you beyond my work.

  • Set up a batch program that looks for new work items for people and send them a Slack message. Wouldn't it be cool to get a digest of everything I need to approve in one private chat window every morning? Check out Incoming Webhooks for more possibilities here.
  • Incoming webhooks would even be a simple way to enable some basic real-time notifications. User-exits or enhancement points could be created with knowledge of the webhooks and respond right away to almost any ABAP event.
  • Plug in some of your master data creation processes (assuming they're simple enough) to fit into a command. Imagine "/sap-create new business partner PAUL MODDERMAN <other field> <other field>".
  • Slack is cloud-based and doesn't require any complex VPN setup. The Slack app on your smartphone would be an easy way to enable your processes for quick response.

 

Go make cool stuff!

Consequences to custom functionality when upgrading from SAP PLM 7.02 (EHP6) to SAP PLM 7.47 (EHP7)


Introduction

 

This blog intends to raise awareness about consequences to custom functionality when upgrading from SAP PLM 7.02 (EHP6) to SAP PLM 7.47 (EHP7). In addition to explaining the consequences, a solution is given to minimize the impact caused by the upgrade.

 

Background

 

I doubt any SAP customers are live with an out-of-the-box standard SAP PLM solution. Most SAP customers have the need to customize, enhance, modify, etc. the standard solution. If you are one of those SAP customers or an implementation partner for one, read on.

 

Technical Details

 

Upgrading from EHP6 to EHP7 changes core APIs used by the PLM Web UI. The most disruptive changes are contained in SAP notes 1861056 and 1980389. The SAP notes change the implementation of methods GET_HEADER and CHANGE_HEADER of the class /PLMI/CL_DIR_BO. The class /PLMI/CL_DIR_BO, as you may already know, is a central class in PLM Web UI. The class serves as a layer between PLM Web UI and SAP DMS. The disruptive change is that the methods GET_HEADER and CHANGE_HEADER no longer work correctly if used outside of the PLM Web UI context, be it from SAP GUI, RFC or background job.

 

The disruptive code in method GET_HEADER

  CASE /plmu/cl_frw_appl_cntrl=>gv_mode.
    WHEN 'C'.
      lv_activity = '02'.
    WHEN 'D'.
      lv_activity = '03'.
    WHEN 'I'.                                          " SAP Note 1962595
      lv_activity = '01'.                              " SAP Note 1962595
  ENDCASE.

 

Similarly, the disruptive code in method CHANGE_HEADER

 

IF ( /plmu/cl_frw_appl_cntrl=>gv_mode = /plmu/if_frw_constants=>gc_mode-change ) AND
   ( /plmi/cl_dir_bo=>mv_not_saved IS INITIAL ).
    lv_tcode = 'CV02'.

 

The problem is that the static attribute /PLMU/CL_FRW_APPL_CNTRL=>GV_MODE has a value of ' ' outside the PLM Web UI context, so any code depending on the correct mode of operation (Display, Change or Insert) will work incorrectly.

 

Solution by SAP

 

I reported the problem to SAP support and it was escalated properly. The official answer from SAP is that the PLM APIs are not released for customer use, and customers are responsible for making the required changes to their own code. Based on the answer I was provided, SAP doesn't even consider the issue severe enough to warrant an SAP KBA.

 

Solution by Yours Truly

 

If you have implemented any custom functionality, enhanced or modified existing functionality that uses the /PLMI/CL_DIR_BO class, chances are your changes won't work after the EHP7 upgrade (or whenever the SAP notes mentioned previously are installed in the system).

 

Regardless of whether you are planning an upgrade, you should make sure that your custom functionality isn't using the GET_HEADER or CHANGE_HEADER methods of class /PLMI/CL_DIR_BO if that functionality is used from SAP GUI, RFC or background jobs. You should start changing your code to use DMS function modules instead of the PLM Web UI APIs.

 

Conclusion

 

I wonder how long SAP is going to stick to their "not released for customer use hence we can break it whenever we want even if customers suffer" mantra. It might have worked in the 90's, but somehow I don't believe it will work in the future. Now that all efforts are on HANA, let's hope they do a better job with APIs there.


Time to be HeaRd! This Week in ASUG’s HR Community


cup_phones.jpg

 

Listening is essential in the HR world. For a team that touches almost every part of the business, there is a lot to know and a lot of expectations. Employees want to feel like HR will listen if there is a concern or issue. Potential new talent expects HR to listen to their questions and make sure the job is a right fit. When listening to everyone and everything, you may ask yourself, When is my chance to be heard? With ASUG, the answer is now! This week, the ASUG HR Community was all about you, because it’s time for someone to listen to the listeners.

 

HR Town Hall - Focus on Services and Support in the Cloud: Who better to chat with than the people behind the solutions you use every day? Jamie Bridwell from SAP® SuccessFactors® took on your questions and comments in this open and honest Town Hall webcast. Watch the recording of this webcast and see if your concerns were addressed. If you still have more to say, keep your eye on the ASUG HR Community for upcoming Town Halls.

 

HR Compensation Special Interest Group (SIG) Town Hall: Yes, another Town Hall! This time, the HR Compensation SIG takes the stage to chat with the HR Community. How are compensation processes enabled with SAP SuccessFactors? What are your options? What about new users? Listen in on this webcast recording to find out – it’ll be posted soon.

 

Future of SAP and SAP SuccessFactors Consulting 2016 –  SAP SuccessFactors Talent Management (Part 3): Jarret Pazahanick is back again with the third and final chapter in his series on what’s next with SAP and SAP SuccessFactors. In this part, the focus is on talent management. See the HR landscape through the eyes of SAP and SAP SuccessFactors experts as Jarret brings us full circle on the past, present, and future of HR. Here you can find Part 1 and Part 2.

 

Help HR Manage Workforce Changes More Intuitively and Effectively with SAP’s New Intelligent Services Capabilities: While that title is very self-explanatory, I cannot stress how important this webcast is. Intelligent Services (IS) is the biggest innovation in the SAP SuccessFactors HCM Suite. Andy Yen from SAP SuccessFactors is here to tell you why and how you can make the most of this new development. Find out why IS will have you cheering with joy and drawing the attention of your coworkers.

 

Upcoming - It's All About Leadership: Developing the Leaders Around You: I would almost rather not post this because it reminds me that this is the last part of Joan Choate and Diana Wood’s leadership webcast series. We talked about Management Versus Leadership and Developing the Leader Within You, but now it is time to look at how we can help each other be better leaders. In the spirit of encouraging leadership, I implore you to register for this webcast and bring along someone who you think would make a great leader.

 

To see more HR resources and content, check out the ASUG HR Community and stay on the lookout for next week’s recap.


SAPPHIRE NOW and ASUG Annual Conference attendees: Don’t forget to register for our Recharge HR half-day Pre-Conference Seminar. See you there!

 

Photo courtesy of Afrinational

Hitachi Data Systems, Oxya, and SAP: Transforming IT


Businesses today have to contend with a new wave of innovative technology start-ups that are able to move quickly to capitalize on changes in the business environment. To compete, traditional businesses will have to change their operating model and look to IT as a source of innovation and competitive advantage.  In line with this, 2016 will see businesses shift their IT focus from infrastructure to application enablement, with more of the IT budget going to application development, analytics and big data. This sets the stage for some of the key IT trends that we see emerging in 2016.

Airbnb.jpg

An example in the hospitality business is Airbnb, a company that was started in 2008 and now has a $25.5B valuation, compared to a large hospitality corporation like Marriott, which was started in 1927 and has a current valuation of $20B. Airbnb is a software company and its IT is in the cloud. It connects people who are looking for lodging with people who can provide lodging for a fee over the Internet. In his State of the Union message in January last year, President Obama urged that the United States end the 50-year embargo against Cuba, creating a surge of demand from expats, businessmen, and tourists to go to Cuba. By April, Airbnb was booking rooms in Cuba. How long do you think it will take Marriott to have hotels in Cuba?

 

There are two major disruptions happening here. The first and most obvious is that software and services businesses that use social, mobile, analytics, and cloud are more agile than brick-and-mortar businesses because they can focus more on applications than on infrastructure. The second disruption is on the customer side. The consumer who used to book a standard room at a hotel is now a “prosumer,” empowered to book whatever, wherever, and at any fee that he chooses. Are there many people who want to find lodging this way? The meteoric rise in Airbnb’s valuation would suggest there are.

 

To compete, traditional businesses will have to transform their operating model to become more agile and connect with customers who are more sophisticated and empowered. Businesses will need to look to IT, their information technology experts, for innovation and competitive advantage. IT transformation will be a key driver for success in 2016.

 

The value of traditional IT can be thought of as a triangle with more than 50% of the value and focus on infrastructure. In this new business environment, we need to turn that triangle upside down, and focus on the value that we bring to the end users through application development and analytics.

IT Triangle.jpg

Infrastructure is still very important. In fact, it remains the base of the triangle and the tip of the spear that facilitates the penetration of development platforms and applications. However, infrastructure must take less of IT’s time, effort, and budget. This can be done through virtualization, automation, software-defined infrastructure, and cloud. Hitachi Data Systems provides virtualized, converged, and software-defined infrastructures that enable the upper layers in the IT stack, development platforms and applications/analytics, to become more agile and scale to meet business requirements.

 

SAP is a key contributor to this transformation in IT. The move to in-memory databases will gather momentum as faster reporting and analysis deliver a clear competitive advantage in today’s real-time business environment. The consolidation of SAP’s business suite onto the HANA in-memory database with S/4 HANA is a great example of how we can turn the IT cost triangle upside down.

 

SAP S/4 HANA allows for a radically different application architecture that simplifies the data model. Transactional processing is drastically simplified: all data is held completely in memory, using only one copy of the data in a columnar store. A new UI and a new framework for modifications and extensions enable more resources to be shifted from operations to applications and real-time analytics.

 

Hitachi Data Systems partners with SAP to simplify and optimize the infrastructure through converged UCP systems that are tested and certified for SAP HANA. Oxya, a subsidiary of Hitachi Data Systems, provides SAP technical and managed hosting services. Oxya is SAP-certified in hosting and cloud services, supporting global and midsized organizations and hosting nearly 250,000 SAP users worldwide across all industries. Oxya can work with customers to design a dedicated infrastructure and tailor IT solutions to meet their stringent SLAs for SAP.

 

Hitachi Data Systems, Oxya, and SAP are positioned to help transform IT.

 

Blog originally posted on the HDS SAP Community by Hubert Yoshida

SAP HANA Cloud Platform @ AribaLive 2016


Introduction

 

IMG_3805.JPG

While not my first time in Las Vegas, this was my first time at AribaLive. From various emails and YouTube videos, I was pretty excited about attending this event. Beyond that, I was excited about the opportunity to evangelize the SAP HANA Cloud Platform (HCP) story with a new group of people, and to learn more about the Ariba Business Network. This opportunity was afforded to me by our team: Jeramy Ivener and Michael Bain, the HCP leads for our go-to-market and center of excellence teams, respectively. I’ve always held the belief that in order to solve a business problem, or any problem for that matter, you must first understand the business. Ariba was a completely new area for me, so being completely immersed in this topic for the next several days was going to be awesome.

 

Event Organization

 

IMG_3798.JPG

The event was 3 days long and took over the Cosmopolitan conference center. The first day consisted of various workshops designed to allow businesses to network with their peers and learn best practices, focused on the buyer, seller, and partner sides of the house, respectively. Many, if not all, of these required pre-registration, so I was not able to attend and cannot offer any more details. Toward the latter half of the day and into the early evening, a welcome reception introduced all attendees to the “Commerce Pavilion.” If you’ve attended other conferences, this is equivalent to a showroom floor, consisting of various partner booths and Ariba demos. Each of the next two full days kicked off with breakfast in the Commerce Pavilion, followed by daily keynotes presented by Ariba executives and invited industry leaders. The remainder of each day was filled with well-organized breakout sessions that made it easy to create an agenda and follow the key topics you wanted to learn about. This meant that if you were interested in a particular learning track, e.g. as a buyer or as a seller, scheduling conflicts were minimized. Tuesday was capped off with a huge party at the Marquee day/night club, which allowed attendees to further mingle, build up their networks, and socialize.

 

AribaLive!

 

IMG_3812.JPG

The content of the conference centered around key themes that were mirrored in the keynote presentations and breakout sessions. Aside from what you’d expect from the keynotes, customers sharing stories of how the Ariba Network has shaped their business and contributed to their success, this year there was a strong message about digitization and the digital economy. Many of us in technology take it for granted that working with a computer or staying connected 24/7 is just like the air we breathe, and sometimes doubt that this could be incorporated into other industries such as agriculture or construction. But the trends have shown that digitization has not only penetrated these industries, it is transforming the way they do business, making them operate more efficiently and faster. The common overtone about digitization was certainly speed: allowing business to be agile and adapt to changes very quickly, all facilitated by digital information.

 

IMG_3813.JPG

Another effect of digital transformation was the need for customization, specifically adapting to the needs of individual industries. It only makes sense that as more and more users begin to use the system, their needs diverge from the existing base, and opportunities arise to make the solutions fit their specific needs, whether that means creating questionnaires tailored to their business or creating an interaction interface for a user group that just wants to order something simple. These needs are now supported by new features in Ariba, including customizable forms.

 

IMG_3887.JPG

With the increased demand for customization and extensibility of solutions, there is a need for help from an ecosystem to create the features and functions tailored to specific business needs. Ariba is therefore adopting quite a new strategy for them: openness. Later this year, they will be opening up more APIs, beyond their current cXML-based services, that will allow the partner ecosystem to build new extension applications to fulfill these additional requirements and needs for their customers.

 

SAP HANA Cloud Platform


IMG_3922.JPGIMG_3919.JPG

This was a perfect fit for SAP HANA Cloud Platform to be present at this event, as HCP is the platform that brings together the tools and services partners and customers need to realize those extensions. We were present in the Commerce Pavilion, and we also made a presence in the digital world by interacting and reacting in the twittersphere with the attendees at the conference. Lastly, Jeramy, Michael, and I had a breakout session toward the end of the conference, when people were primed to learn more about how and where they could build these extensions. Although we were the last session of the last day of the conference, it ended up being standing room only, with a good mixture of Aribians, customers, and partners in attendance. In this breakout session we introduced SAP’s Platform as a Service, discussed the use cases, and demonstrated sample solutions and scenarios that could be built using Ariba and SAP HANA Cloud Platform. Among these was a deeper dive into one of the demos that Alex Atzberger had presented onstage, showing how IoT and predictive maintenance could help businesses stay up and always running. The scenario involved monitoring machinery, in this case a dump truck with sensors monitoring the temperature of its engine. Predictive maintenance showed that over the course of a few days the operating temperature had trended upward, and failure was predicted to occur in the next 5 days. An alert was given to the operator, who, without ever leaving the system, was presented with a cause and a course of action: locate the replacement parts and place the order with a supplier through the Ariba Network. The order was confirmed, a delivery date was given, and the part was ensured to arrive in time before the machine failed.
If you think about all the assets in the SAP portfolio, the scenario could be extended even further: perhaps the expertise to repair the machinery has to be sourced, which we could find through SAP Fieldglass; the travel arrangements for this worker could be made through SAP Concur. The opportunities are quite numerous for SAP HCP, which has the native capability to integrate with these SAP solutions (and others).

IMG_3920.JPG

The Impressions

 

My three takeaways from this event were: I loved the way it was organized; at the ground level, many inside Ariba still do not understand SAP HANA Cloud Platform; and with Ariba opening up APIs, there will be a pull for information on how these APIs can be used to build out scenarios.

 

While the event was only 3 days, it was non-stop and the content was jam-packed. Somehow, though, it was very nicely arranged. As mentioned, I liked that the tracks were well organized and that overlap was minimized. Another great thing was that the sessions all started and ended at the same time, so you didn’t have to worry about missing part of a session. Breakfast and lunch were centered in the Commerce Pavilion, and with no breakout sessions scheduled at those times, everyone ate together and actually had a chance to network and socialize.

 

I spoke to quite a few Aribians, and it was clear that many had heard of HANA, HANA Database, HANA Platform, HANA Enterprise Cloud, and HANA Cloud Platform; with all these HANAs, as they put it, the story of what each was remained a little confusing. I think there is an opportunity for HCP, HANA Cloud Platform, to help clarify the story and distinguish our capabilities from the other HANA family of products. One way is to share a concept voiced by Steve Lucas and many in leadership: there is HANA the brand, and HANA the products, including the database, the platform, and of course us, the HANA Cloud Platform, SAP’s cloud Platform as a Service.

 

The timing couldn’t be more perfect: as Ariba adopts an open API strategy, customers and partners are already asking where they should build. With us here at the ground floor, HCP will help provide many of the answers to the fresh questions many will have. If customers are already using SAP products, why wouldn’t they think about using SAP HANA Cloud Platform? There will be an opportunity for both Ariba and HCP to help customers and partners build solutions that solve business problems end-to-end. HCP is close to Ariba and will help customers maximize the value of Ariba’s new open ecosystem.

 

You can find many more of my impressions in the tweets I made during the event; you can join me on Twitter at https://twitter.com/thesapmic

 

IMG_3803.JPG

NodeJS and SAP HANA, the European Tour!


It's with both pleasure and excitement that I announce the European city tour of SAP HANA XSA, that's our SPS11 release of the SAP HANA server and XS Advanced (NodeJS development). Thomas Jung and Rich Heilman will be flying over to run the very first SAP CodeJam events on this topic in 3 cities in Europe, kicking off the topic becoming generally available worldwide!

 

  • April 19 - Germany
  • April 22 - Netherlands
  • April 25 - United Kingdom

 

Our first stop will be Hannover, Germany, where Inwerken has offered to host us! Seats are of course limited: https://www.eventbrite.com/e/sap-codejam-hannover-registration-23062667058. From there we will head over to the Netherlands, where Ciber has agreed to host our second stop: https://www.eventbrite.com/e/sap-codejam-eindhoven-registration-23185169466. Then we'll give the two experts a short breather until our 3rd and final stop on Monday in London at the Bluefin offices: https://www.eventbrite.com/e/sap-codejam-london-registration-23062375185

 

Remember, the SAP CodeJam is a request-based event that takes 5 to 6 hours in a single day, and our goal is to expose you to the technology and get you familiar with it! Considering this is a new topic for us, we may have some hiccups, but with two of the world's best experts on site we are sure it will be a fantastic event!

 

Some of you may be thinking: come on, 3 cities and you call that a tour? What about our location? Well, the good news is that starting May 1st I have almost a dozen experts ready to host events in the APJ region! For Europe and the rest of the world my number of experts is unfortunately not as high, but if you have an interest in this topic, go ahead and send in your requests!

 

So why the fuss and big deal about XSA? Well, for starters, it opens up a new world of opportunity for developers! See SAP HANA SPS 11: New Developer Features; Tooling - Getting Started. And like Paul, we are curious how people will take to it: My initial thoughts on HANA SPS11 and XS Advanced

My CDS view self study tutorial - Part 5 how to create CDS view which supports navigation in OData service



 

So far we have a working CDS view, ready for us to create a UI5 application on top of it via Smart Template in WebIDE within just a couple of minutes. Once done, the UI5 application will display the data from our CDS view as below. For a step-by-step guide on how to achieve this, please refer to this blog: Step by Step to create CDS view through SmartTemplate + WebIDE.


clipboard1.png

How is navigation implemented among CDS views

 

In this part, let's create CDS views which support node navigation in an OData service. The previous CDS view we created has a flat structure with only a root node. Now let's create a series of CDS views:

 

1. A CDS view which contains two fields: spfli.connid and spfli.carrid. This view acts as the root node of the corresponding OData service model from a semantic point of view, and supports navigation from itself to the defined child node.

 

2. A CDS view which acts as the navigation target of the previously defined "root" view. Besides the two fields sflight.connid and sflight.carrid, which correspond to the root view, it has an additional field, sflight.fldate.

 

OData navigation means that, supposing I am currently in the context of spfli.connid = 0001 and spfli.carrid (the data record in yellow), through navigation I can get all of its dependent data (in red). We will see how this navigation is performed later.
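The parent-to-child navigation described above is, at heart, a key-based filter: given one root (SPFLI) record, the service returns every child (SFLIGHT) record sharing its carrid/connid key. A minimal Python sketch of that relationship, using illustrative sample data rather than real table contents:

```python
# Illustrative sketch (not SAP code): parent SPFLI rows and child SFLIGHT rows
# are linked through the shared key fields carrid and connid.
spfli = [
    {"carrid": "LH", "connid": "0001"},
    {"carrid": "AA", "connid": "0017"},
]
sflight = [
    {"carrid": "LH", "connid": "0001", "fldate": "20160401"},
    {"carrid": "LH", "connid": "0001", "fldate": "20160402"},
    {"carrid": "AA", "connid": "0017", "fldate": "20160403"},
]

def navigate_to_items(parent, children):
    """Return all child rows whose key fields match the parent's keys."""
    return [c for c in children
            if c["carrid"] == parent["carrid"] and c["connid"] == parent["connid"]]

# Navigating from the first SPFLI record yields its two dependent flights.
items = navigate_to_items(spfli[0], sflight)
```

This is exactly the semantics that the association between the root and child views will express declaratively.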

 

clipboard1.png

3. A CDS view which exposes the two fields connid and carrid from the root view and the associated data from the child view.

This view is called the "consumption" view and is used to publish the OData service.

 

Source code of view #1:

 

@AbapCatalog.sqlViewName: 'zspfliroot'
@AbapCatalog.compiler.compareFilter: true
@AccessControl.authorizationCheck: #CHECK
@EndUserText.label: 'root view'
define view Zspfli_Root as select from spfli
  association [0..*] to Zsflight_Child as _Item
    on  $projection.carrid = _Item.carrid
    and $projection.connid = _Item.connid
{
  key spfli.connid,
  key spfli.carrid,
  @ObjectModel.association.type: #TO_COMPOSITION_CHILD
  _Item
}

Source code of view #2:

 

@AbapCatalog.sqlViewName: 'zsflightchild'
@AbapCatalog.compiler.compareFilter: true
@AccessControl.authorizationCheck: #CHECK
@EndUserText.label: 'child_view'
define view Zsflight_Child as select from sflight
  association [1..1] to Zspfli_Root as _root
    on  $projection.connid = _root.connid
    and $projection.carrid = _root.carrid
{
  key sflight.carrid,
  key sflight.connid,
  key sflight.fldate,
  @ObjectModel.association.type: [#TO_COMPOSITION_ROOT, #TO_COMPOSITION_PARENT]
  _root
}

Source code of view #3:

 

@AbapCatalog.sqlViewName: 'zflight_c'
@AbapCatalog.compiler.compareFilter: true
@AccessControl.authorizationCheck: #CHECK
@EndUserText.label: 'flight consumption view'
@OData.publish: true
@ObjectModel: {
  type: #CONSUMPTION,
  compositionRoot: true,
  createEnabled: true,
  deleteEnabled: true,
  updateEnabled: true
}
define view Zflight_Com as select from Zspfli_Root
{
  key Zspfli_Root.carrid,
  key Zspfli_Root.connid,
  @ObjectModel.association.type: [#TO_COMPOSITION_CHILD]
  Zspfli_Root._Item
}

Activate all three CDS views. Since the third, consumption, view has the annotation @OData.publish: true, an OData service is automatically generated once it is activated:

clipboard3.png

How to test navigation

 

First, check the response of the OData metadata request via the URL /sap/opu/odata/sap/ZFLIGHT_COM_CDS/$metadata in the gateway client.

You should find two AssociationSets generated based on the corresponding annotations in the CDS views.

clipboard4.png

The entity set Zflight_Com has type Zflight_ComType, which has the navigation property "to_Item". Now we can test the navigation.

clipboard5.png

First we get the root node's content via the URL: /sap/opu/odata/sap/ZFLIGHT_COM_CDS/Zflight_Com(connid='0400',carrid='LH')

clipboard6.png

The response tells us that the correct URL for navigating from the current node to its child node is formed by appending the navigation property defined in the metadata, to_Item, to the end of the URL, that is: /sap/opu/odata/sap/ZFLIGHT_COM_CDS/Zflight_Com(connid='0400',carrid='LH')/to_Item
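Forming that navigation URL is plain string concatenation: service root, entity set, key predicate, navigation property. A hypothetical Python helper (the function names are mine, not part of any SAP library) makes the structure explicit:

```python
# Hypothetical helpers illustrating how the OData navigation URL is composed.
SERVICE_ROOT = "/sap/opu/odata/sap/ZFLIGHT_COM_CDS"

def entity_url(entity_set, **keys):
    """Build the key-predicate URL for a single entity, e.g. Set(k1='v1',k2='v2')."""
    predicate = ",".join(f"{k}='{v}'" for k, v in keys.items())
    return f"{SERVICE_ROOT}/{entity_set}({predicate})"

def navigation_url(entity_set, nav_property, **keys):
    """Append the navigation property from $metadata to the entity URL."""
    return f"{entity_url(entity_set, **keys)}/{nav_property}"

url = navigation_url("Zflight_Com", "to_Item", connid="0400", carrid="LH")
# → "/sap/opu/odata/sap/ZFLIGHT_COM_CDS/Zflight_Com(connid='0400',carrid='LH')/to_Item"
```

Requesting this URL in the gateway client returns the dependent SFLIGHT entries, as shown in the screenshot below.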

clipboard7.png

How the navigation is implemented on the ABAP side

 

Set a breakpoint in the method below and re-trigger the navigation operation.

Check the generated SQL statement in the variable statement at line 27.

clipboard8.png

SELECT "Zsflight_Child"."CARRID" AS "CARRID",
       "Zsflight_Child"."CONNID" AS "CONNID",
       "Zsflight_Child"."FLDATE" AS "FLDATE"
FROM "ZSFLIGHTCHILD" AS "Zsflight_Child"
WHERE "Zsflight_Child"."CARRID" = ?
  AND "Zsflight_Child"."CONNID" = ?
  AND "Zsflight_Child"."MANDT" = '001'
WITH PARAMETERS( 'LOCALE' = 'CASE_INSENSITIVE' )

 

The values for the two placeholders ( ? ) are stored in me->parameters->param_tab:
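This is ordinary parameterized SQL: the statement is prepared with ? placeholders, and the key values from the parameter table are bound at execution time. The same mechanism can be demonstrated with SQLite in Python (an analogy only; HANA's actual execution engine differs):

```python
import sqlite3

# Analogous demonstration: the two '?' placeholders in the generated statement
# are bound from parameter values at execution time, just as HANA binds the
# values held in param_tab.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE zsflightchild (carrid TEXT, connid TEXT, fldate TEXT)")
conn.executemany("INSERT INTO zsflightchild VALUES (?, ?, ?)", [
    ("LH", "0400", "20160401"),
    ("LH", "0400", "20160402"),
    ("AA", "0017", "20160403"),
])
rows = conn.execute(
    "SELECT carrid, connid, fldate FROM zsflightchild "
    "WHERE carrid = ? AND connid = ? ORDER BY fldate",  # placeholders, as in the trace
    ("LH", "0400"),                                     # bound key values
).fetchall()
# rows now holds only the child records for carrid='LH', connid='0400'
```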

clipboard9.png


And check the response in et_flag_data:

clipboard10.png

clipboard11.png
