Channel: SCN : Blog List - All Communities

VBA Function SAPBExSetVariable not working as required in BExAnalyzer 7x


Usage of BExAnalyzer 3.5 Macros and VBA Function:

 

In BExAnalyzer 3.5 you can use SAPBExSetVariable to set variable values for a query so that you see the desired result once the query is refreshed in BExAnalyzer 3.5.

 

The following macro calls are typically used to refresh a single Data Provider within a workbook.

 

  • Run("SAPBEX.xla!SAPBExSetVariables", lRange)

 

lRange contains the Range address of the Variable Values (Structure RRX_VAR: VNAM, VARTYP, VPARSEL, SIGN, OPT, LOW, LOW_EXT, HIGH, etc...)

 

  • Run("SAPBEX.xla!SAPBEXrefresh", False, lRange)

 

The second parameter defines whether all Data Providers are to be refreshed or only the Data Provider located within the range lRange.

 

This is the flow of the BExAnalyzer 3.5 VBA function SAPBExSetVariable. However, support for BExAnalyzer 3.5 has ended, and you may want to use the function in BExAnalyzer 7X.

 


Usage of VBA Function SAPBExSetVariable in BExAnalyzer 7X

 

When the workbook is upgraded to BExAnalyzer 7X, the macros are automatically adapted to the new 7X VBA code and will look like this:

 

  • Run("BExAnalyzer.xla!SAPBExSetVariables", lRange)
  • Run("BExAnalyzer.xla!SAPBEXrefresh", False, lRange)

 

Assume that you have also implemented SAP Note 1849135 ("Single Data Provider Refresh Functionality NEW").

 


Issues in current flow:

However, you may find that SAPBExSetVariable does not work as required in BExAnalyzer 7X, especially in these cases:

  • Executing this macro more than once with different variable values.
  • The current flow also modifies the variable values of queries for which you did not request SAPBExSetVariable.

 

There are further issues with the flow of SAPBExSetVariable and refresh.

 

Improvement / Enhancement in BExAnalyzer 7X for SAPBExSetVariable


SAPBExSetVariable is originally a BExAnalyzer 3.5 function, and in 7X it can only work if you have already applied SAP Note "1849135 - BExAnalyzer: Single Data Provider Refresh Functionality NEW".

This is because BExAnalyzer 3.5 refreshes a single query at a time, whereas the 7.x refresh architecture is very different. Therefore SAPBExSetVariable can only work if the new single Data Provider refresh is active and Note 1849135 is implemented correctly.


Note 1849135 mentions that the flag "Allow Refresh Function for individual Queries" in the Workbook Settings dialog (tab "General") does not need to be set for the macros to work.


However, there were inconsistencies in the interplay of the new single Data Provider refresh and SAPBExSetVariable. In the newly supported SAPBExSetVariable flow this flag is therefore mandatory, together with the correction from SAP Note 2265155 - VBA Function SAPBExSetVariable not working as required in BExAnalyzer 7X.



Usage of SAPBExSetVariable in BExAnalyzer 7X:


To use SAPBExSetVariable in 7.x you have to make the following changes in your Excel range. So far you only maintained the variables and their information as per the range address of the variable values (structure RRX_VAR: VNAM, VARTYP, VPARSEL, SIGN, OPT, LOW, LOW_EXT, HIGH, etc.).


The new change is that you also have to maintain the Data Provider name in your Excel range, as shown below.


 

VNAM           | VARTYP | VPARSEL | SIGN | OPT | LOW                  | HIGH
DATA_PROVIDER  |        |         |      |     | <DATA PROVIDER NAME> | <DATA PROVIDER NAME>
VARIABLE NAME1 | 1      | I       | I    | BT  | <VALUE>              | <VALUE>
VARIABLE NAME2 | 1      | I       | I    | BT  | <VALUE>              | <VALUE>
VARIABLE NAME3 | 1      | P       | I    | EQ  | <VALUE>              | <VALUE>


So whereas earlier you only maintained the variable information in your Excel sheet cells, with the new changes you also maintain the Data Provider name. As a recommendation, it is easier if you maintain the Data Provider name first.


You also need to make some modifications to your VBA code.


BExAnalyzer 7X VBA Macro code example:


strVBA = Run("BExAnalyzer.xla!SAPBExSetVariables", Sheets("<Sheet Name>").Range("<Range Address>"))

'<Sheet Name> is the sheet on which the Data Provider to be refreshed exists.

Set shtRefresh = ThisWorkbook.Worksheets("<Sheet Name>")

strVBA = Run("BExAnalyzer.xla!SAPBEXrefresh", False, Sheets(shtRefresh.Name).Range("<Range Address>"))




Prerequisite to enable SAPBExSetVariable flow in BExAnalyzer 7x


  • Implement SAP Notes 1832908 and 1849135.
  • Implement SAP Note 2265155 - VBA Function SAPBExSetVariable not working as required in BExAnalyzer 7X.
  • In transaction RS_FRONTEND_INIT, set parameter ANA_SINGLEDPREFR_NEW = 'X'.
  • Set the flag "Allow Refresh Function for individual Queries" either in the Workbook Settings dialog or in the Global Settings, tab "General".

Partnering opportunities in Personalized Medicine


Personalized Medicine offers a unique and exciting opportunity to create high value for customers, patients and consumers. SAP Foundation for Health is the platform of choice to enable personalized medicine, and SAP Medical Research Insights is the first application built and delivered on top of that platform.

Partners that are interested in building applications on the SAP Foundation for Health can now do so within the framework of the SAP PartnerEdge program for Application Development:

  • Both SAP Foundation for Health and SAP Medical Research Insights are now part of the software delivered with the Application Integration Innovation Pack.
  • Together with the HANA & Database Innovation Pack, partners can now reap the full benefits of both the platform and the ISV-focused partner program.
  • Currently, on premise solutions (as opposed to cloud) can be created.

 

This is the most efficient way for any partner, new or existing, to team up with SAP in Personalized Medicine.

 

The SAP PartnerEdge program for Application Development is SAP’s groundbreaking, market-leading partnering model complementing SAP’s technology platforms, which makes it very easy for a partner to build, market and sell applications:

  • All info about the SAP PartnerEdge program for Application Development is here
  • Partner applications are published on the SAP Store. Through this channel, partners can reach all SAP customers
  • Sign-up to the program is straightforward. Information about program fees is also in that location

 

274479_h_ergb_s_gl_SMALLER.jpg


Blog It Forward - Vijay Kalluri


Namaskaram!!,

 

This is a Telugu word meaning “Hello”. Telugu is my mother tongue and the most spoken language in Andhra Pradesh. It is quite interesting to get to know each other in the SCN community.

 

A special thanks to Dibyendu Patra for giving me an opportunity to introduce myself to the SCN community. You can have a look at the entire Blog It Forward (BIF) Chain to see how interesting this is.


To know more details on the Blog It Forward Challenge, check out more info here: Blog It Forward Community Challenge


If you wish to join the challenge and haven’t been tagged by anyone yet, list your details here for someone to invite you: Blog It Forward - Request to Join Table


Introduction:


My name is Vijay Kalluri, born and brought up in Kandukur, a small town in Andhra Pradesh (AP), India. I graduated in Computer Science & Information Technology from Prakasam Engineering College (PEC, JNTU University) in 2006.


I started my career with CMC Ltd (part of TCS) as a fresher. Later on I worked with Mahindra Satyam. For the past year I have been working at Cognizant Technology Solutions (CTS) in Bangalore.


I have 9 years of experience in SAP NetWeaver Portal/Enterprise Portal, Web Dynpro Java, BPM/BRM, ESS/MSS and SAPUI5 (jQuery, JavaScript, HTML5, CSS3, AJAX, JSON).

 

Currently I’m learning advanced skills (SAP Fiori / SAP NetWeaver Gateway) and I am eager to learn HANA development.

Vijay.JPG                     Vijay-1.JPG


Fun Facts about 'Cognizant Technology Solutions':


Cognizant Technology Solutions is an American multinational corporation that provides custom information technology, consulting, and business process outsourcing services. It is headquartered in Teaneck, New Jersey, United States. Over two thirds of its employees are based in India


Cognizant Technology Solutions was started on January 26, 1994.


      • Cognizant is listed in the NASDAQ-100 and the S&P 500 indices
      • Cognizant Technology Solutions has 217,700 employees (March 31, 2015)


Here you go for more information about Cognizant: http://www.cognizant.com & http://en.wikipedia.org/wiki/Cognizant

Cognizant is also ranked 7th in Top 10 Information Technology (IT) Companies in World 2015 | MBA Skool-Study.Learn.Share.


cognizant-manyata-tech-park.jpgdownload.jpg

Fun Fact about my Home Town:


Kandukur is a town in Prakasam district of the Indian state of Andhra Pradesh. It is classified as a municipality, which serves as the headquarters of Kandukur mandal.

 

                                                                          Kandukur.PNG

 

Fun Fact about my State (Andhra Pradesh):


Andhra Pradesh was a state in India created on October 1, 1953, from the Telugu-speaking northern districts of Madras State (Tamil Nadu). The state was made up of two distinct cultural regions – Rayalaseema and Coastal Andhra. The combined region was commonly called Seemandhra or Seema-Andhra.

 

On November 1, 1956, the Telangana region (Hyderabad State) was merged with it to form the united Telugu-speaking state of Andhra Pradesh. The States Reorganisation Commission (SRC) had recommended creating a Telangana state and merging it with Andhra State only after taking public opinion in the elections scheduled for 1961.


                                            Andhrapradesh_Telengana.PNG

On 2 June 2014, Telangana State was separated back out of Andhra Pradesh. The residual Andhra Pradesh State now has approximately the same borders as the old Andhra State of 1956, except that it has lost the Bhadrachalam revenue division, which was part of Andhra State before 1956 and has been moved to Telangana.


The new Andhra Pradesh is one of the 29 states of India. The state has a coastline of 974 km (605 mi), the second longest among all the states of India. Please refer to this video about the Sunrise State of Andhra Pradesh.



The capital of present-day Andhra Pradesh is “Amravati”, whose foundation was recently laid by the Andhra Pradesh government.



                                  Andhra Pradesh.png

Fun Fact about historical places in Andhra Pradesh

 

Tirupati is a city in Chittoor district of the Indian state of Andhra Pradesh.

 

Lord Sri Venkateswara, also known as Srinivasa, Balaji, and Veṅkaṭachalapati, made Tirumala his abode five thousand years ago. Even before him, it was Lord Varahaswami who had made Tirumala his abode. Since then, many devotees have continued to construct grand entrances on the ramparts of the temple over generations. The temple complex is spread over 16.2 acres of land

 

  “Tirumala, in all its right, is heaven. Its powers are indescribable. The Vedas have taken the form of rocks and appeared on Tirumala. Holiness has taken the form of water and is flowing as streams on Tirumala. Its holy peaks are Brahmaloka and other lokas. Srinivasa lives on Seshadri.”


Kindly refer to this Wiki link for more information about Tirupati.


Fun Fact about festival in Andhra Pradesh

 

Sankranti


Makar Sankranti marks the transition of the Sun into the zodiac sign of Makara rashi (Capricorn) on its celestial path. The day is also believed to mark the arrival of spring in India and is a traditional event. Makara Sankranti is a solar event, making it one of the few Indian festivals which fall on the same date in the Gregorian calendar every year: 14 January, with some exceptions when the festival is celebrated on 13 or 15 January.

        • Day 1 – Bhogi 
        • Day 2 – Makara Sankranti - the main festival day
        • Day 3 – Kanuma

The day preceding Makara Sankranti is called Bhogi, and this is when people discard old and derelict things and concentrate on new things causing change or transformation. At dawn people light a bonfire with logs of wood, other solid fuels and wooden furniture at home that are no longer useful. The disposal of derelict things is where all old habits, vices, attachment to relations and material things are sacrificed in the sacrificial fire of the knowledge of Rudra, known as the "Rudra Gita Gyana Yagya". It represents realization, transformation and purification of the soul by imbibing and inculcating divine virtues.

                                      Bhogi-1.jpgBhogi-2.jpg

The second day is Makara Sankranti. People wear new clothes, pray to God, and make offerings of traditional food to ancestors who have died. They also make beautiful and ornate drawings and patterns on the ground with chalk or flour, called "muggu" or "Rangoli" in Telugu, in front of their homes. These drawings are decorated with flowers, colors and small hand-pressed piles of cow dung, called "gobbemma".

 

On the day after Makara Sankranti, the animal kingdom is remembered, and in particular the cows. Young girls feed the animals, birds and fish as a symbol of sharing. Travel is considered to be inappropriate, as these days are dedicated to the reunion of families. Sankranti in this sense demonstrates strong cultural values as well as a time for change and transformation. And finally, gurus seek out their devotees to bestow blessings on them.

On the third day, Kanuma is celebrated. Kanuma is an event which is very intimate to the hearts of farmers, because it is the day for praying for and showcasing their cattle with honor. Cattle are a symbolic sign of prosperity.


                                                                              pongal-1.jpg

Kanuma, Mukkanuma and the day following Mukkanuma also call for celebrations with the union of families, friends and relatives, followed by various fun activities, which mainly include **** Fighting, Bullock/Ox Racing, Kite Flying and Ram (Pottelu) Fighting.

On this occasion, in every town and city, people play with kites and the sky can be seen filled with beautiful kites. Children and elders enjoy this kite flying occasion.

  Another notable feature of the festival in Andhra Pradesh is the Haridasu, who goes around early in the morning with a colorfully dressed cow, singing songs of Lord Vishnu (Hari), hence the name Haridasu (servant of Hari). It is a custom that he should not talk to anyone and only sing songs of Lord Vishnu when he goes to everyone's house.

                                                                        Kanuma.JPG

Diwali (Festival of lights)


The festival spiritually signifies the victory of light over darkness. Diwali falls between mid-October and mid-November


    Devali-3.JPG      Devali-2.JPGDevali-1.JPG

Fun Fact about my Country & Culture:

 

One-third the area of the United States, the Republic of India occupies most of the subcontinent of India in southern Asia. It borders on China in the northeast. Other neighbors are Pakistan on the west, Nepal and Bhutan on the north, and Burma and Bangladesh on the east.


Languages spoken, with percentage (%), in my country:


Hindi 41%, Bengali 8.1%, Telugu 7.2%, Marathi 7%, Tamil 5.9%, Urdu 5%, Gujarati 4.5%, Kannada 3.7%, Malayalam 3.2%, Oriya 3.2%, Punjabi 2.8%, Assamese 1.3%, Maithili 1.2%, other 5.9%


                                                                  18955517-Indian-Map-with-Cultural-Object-Stock-Vector-india-flag.jpg

How I come to know about SCN and What do I most enjoy on SCN?

My manager and a colleague at Mahindra Satyam asked me to join SDN (Jul 26, 2011) and contribute answers to questions.

I am really thankful to them for suggesting such a wonderful site. On SCN a lot of information, tips and tricks are available, with numerous examples.

I enjoy reading and creating blogs, documents and threads on SCN, because by reading them I get very valuable information, and it also helps me to increase my knowledge. Every time, I got the best solution for whatever problem I faced regarding SAP, and I also learned SAPUI5, BPM and BRM through SCN.

What is your favorite place in the world?

India is my favorite country in the world. However, Europe is my favorite place in the world. My wife and I both like Germany in Europe.

What is my Ambition?

I want to be a good Consultant on SAP UI5, FIORI, Gateway and HANA.

If you have spare time, which 3 activities do you do (other than being on SCN)?

      • Spending time with my family
      • Watching News Channels/ comedies / Movies in TV


Individual Leaders in Webdynpro JAVA/Enterprise Portal and JAVA

WDJ.JPGEP.PNG          JAVA.JPG


I was tagged in Blog It Forward by

Jun Wu's Profile | SCN

Saleem Baig Mohammed

Shyamala Kalluri's Profile | SCN

 



Best Regards

Vijay K- Kalluri

 

That's all from me....

Thank you for reading my blog and getting to know me.

DB2 to HANA Migration - Achieving the DB2 Translate() function in SAP HANA


DB2 to SAP HANA Migration - the DB2 Translate() function in SAP HANA


DB2 Query:

 

select PHONE_NUMBER,
       TRANSLATE(PHONE_NUMBER, '', '1234567890ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz()-/') as PHNE
from "Phone_test"


In the query above, wherever a character from the string '1234567890ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz()-/' is found in the phone number, it is replaced with a space (' ').
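To make the migration target precise, it helps to pin down what this TRANSLATE call does. The sketch below (Python is used purely for illustration; it is not part of the original DB2 or HANA code) mimics DB2's three-argument TRANSLATE with an empty to-string, where every character listed in the from-string is replaced by the pad character, a space by default:

```python
def db2_translate_to_space(value, from_chars):
    """Mimic DB2 TRANSLATE(value, '', from_chars): every character that
    appears in from_chars is replaced by the default pad character (a
    space); all other characters pass through unchanged."""
    strip = set(from_chars)
    return ''.join(' ' if ch in strip else ch for ch in value)

# The strip list from the query: digits, letters and ( ) - /
FROM_CHARS = ('1234567890ABCDEFGHIJKLMNOPQRSTUVWXYZ'
              'abcdefghijklmnopqrstuvwxyz()-/')

# Only characters outside the list survive, so the result exposes any
# unexpected characters hiding in a phone number, e.g. '+' or '.'.
print(db2_translate_to_space('(040)123-4567 +9', FROM_CHARS))
```

Since the normal phone-number characters all end up as spaces, a non-blank result flags entries containing unexpected characters.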


Result (output):


Db2 Image.PNG

In SAP HANA there is no TRANSLATE function or similar built-in available. However, there is a workaround using a regular expression for this scenario.
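The workaround rests on the observation that the whole DB2 from-string collapses into a single regular-expression character class. Sketched again in Python for illustration (in HANA SQL the corresponding function is REPLACE_REGEXPR; check its exact syntax against the SQL reference of your HANA revision):

```python
import re

# One character class covers digits, letters and ( ) - / ; the hyphen is
# placed last so it is read literally rather than as a range.
STRIP_PATTERN = re.compile(r'[0-9A-Za-z()/-]')

def translate_via_regex(value):
    """Replace every character matched by the class with a space,
    reproducing the result of the DB2 TRANSLATE query above."""
    return STRIP_PATTERN.sub(' ', value)
```

A HANA formulation along the same lines would be, for example, REPLACE_REGEXPR('[0-9A-Za-z()/-]' IN PHONE_NUMBER WITH ' ' OCCURRENCE ALL) — treat this as a sketch to verify against the HANA SQL reference.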

SAP HANA query with output:

HANA Image.png

 

Thanks,

Asif

SAP TM Optimizer performance – An ABAP developer’s point of view

$
0
0

In the last couple of months I got several questions regarding the TM optimizer and its performance. Therefore I decided to write this blog, which should give some information on how the optimizer works and some tips to improve performance in your scenario (for more information regarding the optimizer and optimizer performance you should check out SAP Note 1520433 and the blog series Effective Optimization Series).

 

General
Let me start with some general information about the optimizer in SAP TM.
When you run the optimizer with the TM standard default, there are 3 steps which are processed: the optimizer pre-processing, the optimization, and the optimizer post-processing (check process controller strategy VSR_DEF).
In the pre-processing all the relevant data is collected and converted to the optimizer format. This means that data about all selected freight units, freight orders, and resources needs to be collected. Furthermore, information about all the possible routes through your transportation network is determined. Also, optimization context data needs to be collected. This context includes e.g. orders on selected resources within the planning horizon, to let the optimizer know when the resources are already in use.
The optimizer itself is an algorithm which creates a valid transportation plan and tries to optimize the solution regarding the planning costs which are set in the planning cost settings of your planning profile.
In the post-processing the optimizer solution is stored back into the TM system, which means all the changed data is saved in the system.

There are two main points which will cost performance: volume & complexity. Working with the optimizer you should try to keep one principle in mind: "take the data needed, but not more".

opt1.png

 

Data Selection

One important point in reaching good optimizer performance is selecting the right data. Two very important settings here are the selection horizon and the planning horizon. For the right planning horizon you need to analyze your own business. If you are running a day-to-day business, the planning horizon will usually only cover the next few days. If you need to make plans over the next months, your planning horizon covers a longer time period. Maybe you are using some offset to get your plans ready one or two days before execution. In order to reach good optimizer performance, your selection of freight units and freight orders should depend on the planning horizon. For sure, all of those horizons should be as small as possible to improve performance.

The optimizer will never plan any freight unit whose pick-up and delivery windows lie outside the planning horizon. Therefore, use time selection attributes in the selection profile to avoid processing any of those freight units.

In the selection profile you can furthermore exclude all freight units which are already planned. Those freight units would be fixed by the optimizer anyway if you don't also select the freight orders they are assigned to. If you select freight orders, the assigned freight units will be added to your selection anyway in order to process them. Please keep in mind that the optimizer will delete the currently planned freight orders and start from scratch for the assigned freight units if you process them in an optimizer run without using incremental planning.

 

Optimization Time

In the planning profile you can control how long the optimization should run. The more time the optimizer gets, the better the results will be from a planning cost perspective.

Please also check this setting if you are concerned about the overall optimization time. The TM default gives the engine 20 seconds.

Be aware that the right optimizer runtime setting depends heavily on your scenario. It's definitely not a good idea to just reduce the runtime in order to get a faster response!

opt2.png

 

Notes

Over the last months we made several improvements to optimizer performance. Those are contained in the following set of notes for optimizer pre- and post-processing.

Here are some of the recent SAP Notes you should implement when facing performance issues. I will try to update the list from time to time.

Optimization in general: 2225934

Optimization resulting in creating multiple freight orders with HBL: 2225719

Optimization using incremental planning: 2225741, 2257414

Optimization with carrier selection: 2188496

Optimization with charge calculation: 2184962

EPM and Analysis Office for BPC: where roads converge


One of the most frequently asked questions from customers using BPC at the moment is which client tool is the best fit for BPC on HANA. In this article we will compare the various front-end tools that can be used in combination with SAP BPC in a simple way. We will do this by drawing a comparison between the SAP BPC front-end tools Analysis for Office (AO) and the EPM Add-in, which are both used in Microsoft Office. We will conclude with the new SAP convergence client 2.2, which was released in November 2015, and take a look at what is to come with 2.3 in 2016.

 

Analysis Office vs. EPM Add-in | It seemed a very easy task to list the differences between the tools before writing this article, but in practice it turned out to be quite a challenge to identify factors that determine the completeness of a tool. This is comparable to the discussion about the exact distinction between Reporting and Analysis in SAP front-end tools, especially for planning applications like BPC NW, Integrated Planning and BPC on HANA 10.1. To get a clear picture of the tools, we started by looking at what is shipped by default with Microsoft Office products, of which everyone knows how they work, and what advanced users can do with them. We also know how powerful Excel is and how it is the tool of choice within Finance departments. Excel is of course ideal for calculations, pivot tables and a focus on content rather than formatting/presentation and data integration. A major missing element of Excel is the ability to easily connect to SAP systems where all actual data is stored. For example, Excel doesn’t hold any integration of actuals versus plan data, making it very difficult to have one centralized version (one version of the truth), and performance can be a problem if you have complex calculations or data retrievals.

 

 

Analysis Office (just-bi.nl)


Analysis for Office
| What makes Analysis for Office different from the EPM Add-in? In the first place, AO requires a BusinessObjects platform (if you want to deploy your reports on a BO server), whereas EPM is a standalone client tool (Excel add-in). AO is also the successor of the BEx (Web) Analyzer and can be installed as a Microsoft Excel plugin or used as a web version (OLAP). Analysis Office is the right tool for users with extensive analysis needs. It is particularly suited to access and analyze OLAP (Online Analytical Processing) sources. It offers slice-and-dice capabilities and supports the use of hierarchies, for example a cost center hierarchy. It is a tool for analysts who know the data and are looking for answers by exploring the data. The dataset is usually built by the IT organization and is often based on the specifications of the business analyst. AO works nicely for ad-hoc analysis-type reporting without a focus on formatting and complex calculations (e.g. VLOOKUPs and pivoting).

 

Analysis for OLAP | Functionality-wise, Analysis for OLAP matches Analysis for Office, with the big difference that it runs in a browser. This is often preferred from an IT perspective because the roll-out and maintenance of the tool is much easier. However, the fact that it lacks integration with Excel makes Analysis for OLAP less popular with business analysts. Analysis for OLAP doesn’t have a future in SAP’s plans and seems to be phased out in the future.

 

 

EPM Add-in | The Enterprise Performance Management or EPM Add-in for Microsoft was released for SAP EPM’s suite as depicted below. This client tool was built to harmonize different Microsoft Office based clients used within the SAP BusinessObjects and EPM portfolio. Previous tools like Extended Analyzer (acquired from Cartesis) used for BPC 7.X and EVDRE were migrated to EPM 10.0. The EPM tool contains huge improvements with Excel integration and provides features and functions for the business users to analyze and report. One of the features of EPM is the ‘Local Members’  which can be explained as ‘dynamic referencing formulas’. This feature is integrated in Excel  and can be created in EPM by the business users without any involvement of the IT organization. This increases the user adoption  of EPM. The business users are now in the driver’s seat and are able to create their   own reports, local formulas, input on data cells and reuse the look and feel by applying the standard EPM formatting sheet. EPM is not only used for BPC but it can also be used for the entire  EPM Portfolio. For example, Financial data (BOFC) can be combined with SAP Strategy Management (SSM) by joining data from different EPM data sources into one report.

 

 



One EPM client for BPC Standard and Embedded version
| EPM has been released for two types of BPC on HANA versions: BPC ‘Standard’ and ‘Embedded ’. The Standard version is like a standalone version within SAP BW and it has a Consolidation engine in it, which is used for Financial Planning & Consolidation. With Embedded version BW on HANA is integrated and Master/Transaction data can be reused without any conversion or formatting. Embedded version can’t consolidate data but it’s the best for Planning, like Budget, Supply Chain Planning, Cost/Sales/Logistic Planning etc. Both versions work  in Excel but with different connections to their data sources.

 

Analysis Office 2.X: Convergence of AO and EPM Add-in | In September 2015 SAP announced a new client tool for EPM and AO, called Analysis Office, which includes the EPM add-in and AO in one installation. The tools are merged only technically, from an installation perspective, and EPM/AO can be used independently from each other: there are two separate ribbons, each with their specific functionality. If you are using AO reports, you have to log on using the AO client, and if you want to switch to EPM, an additional logon is required to access the EPM reports. Once you are logged in, the refresh function works across tools and can be used for all reports which are opened in the EPM/AO sessions. We don’t see any added value in moving from EPM or AO to the new converged client if you just use one of these tools. From an IT perspective it makes scripting easier, only having one script to roll out both tools in one client.

 

 

AO Converged client

Analysis Office 2.2 has been released
|
The new Analysis Office 2.2 has been released in November 2015 with more integration and improvements. For the AO 2.2 version, EPM SP23 was taken as the reference for the EPM part. Analysis Office has been improved with more features like formatting and prompts. But is this version now ready to be used, or just for piloting? Eventually you would expect that EPM and AO are merged into one single tool. What we heard explicitly at SAP’s strategy sessions is that both tools will be merged into one Office client, and that would be Analysis Office. This is something we should expect to be delivered in 2016, but we are eagerly anticipating a new roadmap for Analysis Office 2.4 and beyond.

 

 

Latest and greatest | Analysis Office 2.3 has not been released yet, but we already know that AO 2.3 is planned for June 2016. What we know about the 2.3 version is not much more than that Analysis Office will handle local calculations similar to those in EPM, which would be a big improvement. This means you can have ‘local member’ formulas in rows or columns and you can apply specific formatting to them. Another feature is the ‘work statuses’ (facilitating information about the planning cycle, such as data locking or submission management), which was not supported in the previous versions. More and more reasons to switch from EPM to Analysis Office for the BPC ‘Embedded’ version.

 

 

Wrap-up | We can clearly see SAP is investing in further development and enhancement of AO, while EPM only has bug fixing as a priority. This brings us to conclude that EPM, introduced in 2011, is nearing the end of its product lifecycle.


Thank you for reading this article and we would appreciate your feedback: analytics@just-bi.nl


References:

1. Magic Quadrant for Business Intelligence and Analytics Platforms, Q1; 2015
2. Analysis Office Roadmap Webcast Notes: Apr 24, 2015
3. BA270 TechEd 2015 Presentation Slides 20-24: Hands-On, Embedded Model in SAP BPC
4. The Forrester Wave™: Enterprise Business Intelligence Platforms, Q1 2015
5. BA261 TechEd 2015 Presentation Slides 9 – 19: Analytics Clients in Microsoft Office
6. Roadmap of Analysis Office: Dec 8, 2015

New with 7.1 SP14: Dependency Diagrams (KPI trees)


With SAP Solution Manager 7.1 support package 14 another puzzle piece has been added towards a complete "Business Process Improvement Suite" as I would call it. While we provided Business Process Analytics as the root cause analysis tool in 2010 (meanwhile also available in an ad-hoc version and as native iPad app) and added a dashboard layer on top in 2011, we now shipped a new SAPUI5 application called "Dependency Diagrams".

 

As we ship such vast KPI content out-of-the-box, with close to 1,000 KPIs, customers can easily lose the overview of what is measured and tracked, in what way and for what purpose. With the help of the Dependency Diagrams you can build a kind of KPI tree, where you bring the different KPIs into a logical, hierarchical order. You can visualize the dependency chains of KPIs and get the "picture" updated with your live data.

 

When you access the application you can select one diagram that must have been defined beforehand, e.g. a diagram about the typical financial business KPI "Days Sales Outstanding" (DSO).

 

Initial select.png

 

After selecting the diagram you see the root level (here Days Sales Outstanding itself) and the dependent level 1 KPIs. In our example all five level 1 KPIs are collected via Business Process Analytics, which is why you see an Analyze hyperlink that allows direct forward navigation into Business Process Analytics. The backend data from your SAP Business Suite or SAP S/4HANA system can be shown as a count of documents/items (e.g. 241 deliveries or 310 SD invoices) or as one accumulated monetary value (e.g. 2,7 million EUR or 311,3 million USD).

The tiles that you see could be

  • populated by Business Process Analytics data
  • populated by Business Process Monitoring data
  • populated via query from a connected SAP BW system
  • not populated with any data and just used for modeling/visualizing

 

Below SD Orders not billed and Open customer items we see a '+' icon, so we can further expand the tree in those areas.

 

Small tree.png

 

As we know that the majority of billing documents is created via billing run, we put another KPI below the SD Orders not billed. This additional KPI looks specifically at all error messages that were raised during billing due runs. One of the most typical error messages is the problem of incomplete sales order items and hence you can further expand to see Missing fields in SD documents.

Big tree.png

So with the help of this application you can

  • Bring business BW reporting and operational Business Process Analytics data for root cause analysis together in one view
  • Structure your KPIs in meaningful dependency diagrams, so that every manager and subject matter expert understands the leverage that one low-level KPI might have on any (business) KPI on top, like DSO.
  • Set threshold values so that the numbers in the tiles get a green, amber or red rating
  • Decide if you put your focus on document numbers or monetary values (and which target currency)
  • Use the forward navigation to get from this overview into the actual root cause analysis with Business Process Analytics.

 

This application can be used on any device and any screen resolution, because of the SAPUI5 user interface.

 

Configuration

 

There are no diagram templates shipped with this application. Instead you have to model everything yourself. The configuration has some similarities to the configuration of Business Process Operations dashboards in SAP Solution Manager. So you have to create Analytical Key Figure Instances (AKFIs) first, where you define which data should be displayed from which source and define potential thresholds. Then you model the respective Dependency Diagram / KPI Hierarchy and bring the AKFIs into some logical order.

There is always just one root. After this you simply specify which AKFI is a child of which other AKFI (identified by technical name, not description). You can have more than one child for every parent tile. Based on these parent/child relationships the application calculates the hierarchy level automatically and arranges the tiles from top to bottom.
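The automatic level calculation from parent/child pairs can be sketched in a few lines. This is an illustration only, not the actual Solution Manager implementation; the AKFI names are hypothetical abbreviations of the DSO example above:

```python
from collections import deque

def hierarchy_levels(root, parent_child_pairs):
    """Derive the display level of each AKFI tile from parent/child pairs.

    root: technical name of the single root AKFI.
    parent_child_pairs: iterable of (parent, child) technical names.
    Returns a dict mapping each AKFI to its hierarchy level (root = 0).
    """
    children = {}
    for parent, child in parent_child_pairs:
        children.setdefault(parent, []).append(child)

    levels = {root: 0}
    queue = deque([root])
    while queue:  # breadth-first walk from the root downwards
        node = queue.popleft()
        for child in children.get(node, []):
            if child not in levels:  # first (shortest) path wins
                levels[child] = levels[node] + 1
                queue.append(child)
    return levels

# Example: part of the DSO diagram from the text (names abbreviated)
pairs = [("DSO", "SD_ORDERS_NOT_BILLED"), ("DSO", "OPEN_CUSTOMER_ITEMS"),
         ("SD_ORDERS_NOT_BILLED", "BILLING_RUN_ERRORS"),
         ("BILLING_RUN_ERRORS", "MISSING_FIELDS_SD_DOCS")]
print(hierarchy_levels("DSO", pairs))
```

A breadth-first walk assigns each tile the shortest distance from the root, which matches the top-to-bottom arrangement described above.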

Diagram setup.png

 

So if you have Business Process Analytics already up and running in your SAP Solution Manager 7.1 with support package 14, then you should be only a few clicks away from creating your own Dependency Diagram and showing it to your management in order to get the budget for your desired business process improvement activities.

 

Further reading

You can find all necessary information about Business Process Analytics in this document. More information on Business Process Improvement for SAP solutions can be found here.

 

Frequently Asked Questions about Business Process Monitoring and Business Process Analytics are answered under http://wiki.sdn.sap.com/wiki/display/SM/FAQ+Business+Process+Monitoring and

http://wiki.sdn.sap.com/wiki/display/SM/FAQ+Business+Process+Analytics respectively.

 

The following blogs (in chronological order) provide further details about Business Process Analytics and Business Process Monitoring functionalities within the SAP Solution Manager.

Enable Proxy settings SMP 3.0 SP07+


Overview

 

If SMP 3.0 is in an environment with no direct access to the internet and a proxy is required, you will need to perform manual steps to configure the proxy. This blog walks you through all the required steps to enable the proxy in SMP 3.0 SP07+.

 

 

Environment

 

  • SAP Mobile Platform 3.0 SP07
  • SAP Mobile Platform 3.0 SP08
  • SAP Mobile Platform 3.0 SP09

 

I have found that in order to make the proxy work, you need to perform two steps.

Step 1 is to change the settings on the SMP Admin Cockpit.

Step 2 is to configure the props.ini file on the SMP file system and then regenerate the SMP 3.0 service.

 

 

Proxy settings on the SMP Admin Cockpit

 

Log in to the SMP Admin Cockpit with your credentials.

 

Navigate to Settings > System and find the form for the proxy settings.

 

proxysettings.PNG

On the bottom left of the page, press Save. Restart the server after you make these changes.

 

Proxy settings on the SMP 3.0 Server (props.ini)

 

Log in to your SMP 3.0 Server and follow these steps to configure your proxy:

 

  1. Open the props.ini configuration file, located in the default installation path <SMP-HOME-PATH>\MobilePlatform3\Server, in a text editor
  2. Look for the following parameters:
    1. -Dhttp.proxyHost=
    2. -Dhttp.proxyPort=
  3. Add your proxy as shown below and save your file
    proxy1.jpg
  4. So if for example your proxy is called "myProxy" and the port is "8080", then that would look like this:
    1. -Dhttp.proxyHost=myProxy
    2. -Dhttp.proxyPort=8080
  5. If the secure https proxy is the same server, then the configuration would look like this:
    1. -Dhttp.proxyHost=myProxy
    2. -Dhttp.proxyPort=8080
    3. -Dhttps.proxyHost=myProxy
    4. -Dhttps.proxyPort=8080
  6. Once you are done, save your changes

 

Note: If your proxy settings require a username and password, use the following on the props.ini file.

    1. -Dhttp.proxyUser=<username>
    2. -Dhttp.proxyPassword=<password>
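Before restarting, you can double-check the edits by scanning props.ini for the proxy parameters. A minimal sketch, assuming the `-Dhttp(s).proxy*` parameter names shown in the steps above:

```python
import re

def read_proxy_settings(props_text):
    """Extract -Dhttp(s).proxy* values (host, port, user, password) from props.ini text."""
    settings = {}
    for match in re.finditer(r"-D(https?\.proxy\w+)=(\S+)", props_text):
        settings[match.group(1)] = match.group(2)
    return settings

# Sample content mirroring the example in the steps above
sample = """
-Dhttp.proxyHost=myProxy
-Dhttp.proxyPort=8080
-Dhttp.proxyUser=admin
-Dhttp.proxyPassword=secret
"""
print(read_proxy_settings(sample))
```

Run it against the real file contents (e.g. `read_proxy_settings(open(path).read())`) and verify that host and port come back as you expect.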

 

Note: any change to props.ini requires a restart. If SMP 3.0 runs manually via the desktop shortcut, simply stop and restart the server using the shortcut. If it runs as a service, stop the service and regenerate the service configuration so that SMP 3.0 picks up the new proxy configuration.


Regenerating the SMP Mobile Service


To regenerate the service so that SMP uses the new configuration from the props.ini file, follow these steps:

  1. Stop the SMP 3.0 service if it is running as a service
  2. Open a command prompt as an Administrator
  3. Navigate to <SMP-HOME-PATH>\MobilePlatform3\Server\bin
  4. Execute: svcutil -uninstall
  5. Execute: svcutil -generate
  6. Execute: svcutil -install
  7. Restart the SMP Mobile Service
  8. SMP 3.0 should now be able to access external resources using the proxy configuration
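Scripted, the regeneration amounts to running the three svcutil commands from the bin directory in order. A hedged sketch (the path layout follows the steps above; run the real thing from an elevated prompt, and note this defaults to a dry run that only prints the commands):

```python
import os
import subprocess

def svcutil_sequence(smp_home):
    """Return the three svcutil commands that regenerate the SMP service config."""
    bin_dir = os.path.join(smp_home, "MobilePlatform3", "Server", "bin")
    return [[os.path.join(bin_dir, "svcutil"), flag]
            for flag in ("-uninstall", "-generate", "-install")]

def regenerate(smp_home, dry_run=True):
    for cmd in svcutil_sequence(smp_home):
        if dry_run:
            print(" ".join(cmd))  # show what would run
        else:
            subprocess.run(cmd, check=True)  # stop on the first failure

regenerate(r"C:\SMP")  # dry run: prints the uninstall/generate/install commands
```

Keeping `check=True` matters here: if `-uninstall` fails there is no point running `-generate` and `-install` against a half-removed service.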

 

Note

 

For more information, you can always refer to KBA 1969316.

 

 

Summary

 

This article explained how to enable and configure the SMP 3.0 SP07+ proxy using the props.ini file and how to regenerate the SMP 3.0 service.


What happens when you decide to change your Data Model in IS-U


Well, there is a good number of online resources about Master Data in SAP IS-U; thanks to those authors.

 

SCN Blogs:

SAP IS-U data model - Business and Technical Master Data

 

SAP wikis:

How to Create Technical Master Data - Utilities Industry - SCN Wiki

 

But what happens when you decide to change the basic data model in your IS-U system? Wait, why would you want to do this? To give you a scenario: in one of my previous projects the data model was not standard in the first place, and we had to change it to the standard best practice. Now you know why this blog exists!

 

By not standard I mean that, initially, for each division, i.e. for the Electricity service and for the Gas service, users were using individual Contract Accounts and also individual technical master data. This means duplication of data and business processes. In this case we had to change the data model towards as standard an SAP IS-U approach as possible, with minimum impact on the current business processes.

 

There could be many ways to change this data model, but I am not going into the detail of each, as it depends on the specific client/project situation, scale, time, money, etc. A few methods are below:

 

A) Having a custom migration process (initially thought of as a one-off process) where we did updates to the master data tables.

B) For each affected scenario, perform a move-out of one fuel and perform a move-in (fewer custom developments needed compared to A).

 

Each method will have its own pros and cons, and needs further custom enhancements and logic to be implemented in various processes within IS-U.

 

For the first method (A) we need to consider all the business processes currently being used by the customer and design an improved process by suggesting the standard. Following are the processes which I could think of immediately:

 

1)  Move-in

2)  Move-out

3)  Change of Tenancy

4)  Billing of Contracts (no major change here...)

5)  Invoicing

6)  Invoice printing (this is the big one)

7)  Scheduling and Management of Meter Reading Process (no major change here as well)

8)  Changes in Payment Schemes (next big one)

9)  Payments and Returns

10) Collections and Dunning

 

Now coming to the technical part, yes, you got it right: most of the Technical and Business Master Data tables need to be updated, to name a few (and not restricted to): EANL, EUIHEAD, EVBS, EVER and other tables like EEINV, EAUSV, EBP_ADDR_SERVICE, etc.



Guide: How To Mass Upload attachments to Accounts in Cloud for Customer


Business Context

In Cloud for Customer projects the migration of data is an important aspect. C4C provides the tooling to migrate data from legacy systems into Cloud for Customer. When a customer is moving from their legacy CRM system to C4C, there is usually quite a history of customer information in the form of e-mail messages, contracts, or other kinds of documents. Obviously this history contains valuable information and needs to be available for sales reps to serve their customers.

Documents.jpg

Until recently you had to write your own program and make use of web services in order to mass upload documents and assign them to the accounts in C4C.

 

Recently SAP has added new functionality to C4C to mass upload documents and assign them, in this case, to accounts. This blog describes how to benefit from this Data Workbench functionality.


Before we can actually upload documents we first need to create an OData service to import the attachments.


 

Creating the OData Service


First log on as administrator in the Silverlight client.

 



Go to (Beta) Administrator-> (OData) Service Explorer

  • Select “Custom OData Services” from the “show” drop down
  • Create new OData service
  • Click on “New”
  • On quick create, give the service name and select the checkbox “Data Workbench Enabled”.

02 acct attachment create.JPG

 

Click on “Select Business Object”

  • Give the Object and the node a name and click OK


03 object and root.JPG

  • Create Entities
    • Select the line “Root” and check the Root tick box, an entity will be added to  the “OData Service” list on the right

 

    • Expand the “Root”, and select Attachment Folder association, an entity (CustomerAttachmentFolder) will be added to the “OData service” list. This entity name will appear in the Data Workbench “Select an Object” list with the same name.


04 OData editor.JPG

  • Provide Scheme code.
    • Go to “ID Mapping” tab, select the scheme code ERP Account
    • Select the Parent external key i.e. property with name CustomerAttachmentFolderExternalKey
  • Expand the Customer AttachmentFolder entity.

05 ID mapping.JPG

  • Click on Activate


Creating the attachment upload file

 

  • The documents that you want to upload should all be in one zip file. The zip file must contain a manifest file that indicates which document should be attached to which customer in C4C. In this example I have three MS Word documents that I want to attach to three different customers.

06 zip.JPG


  • I have created a manifest file in Excel. The customers that I use in this file have been replicated from SAP ECC and have an external ID. The first column contains the external ID. The second column is the default Business Partner Document Type code. The third column is the name of the attachment as it will appear in the customer's Attachments tab in C4C.
  • Depending on the document, each document should have its own MIME type. You can find the valid MIME types in the fine-tune activity "Allowed MIME Types for Document Upload". In this example I want to upload three .docx documents, so I choose: application/vnd.openxmlformats-officedocument.wordprocessingml.document
  • Save the file as manifest.mf and make sure it uses the UTF-8 encoding
  • Save all the documents and manifest file in one zip file
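Assembling the manifest and zip can also be scripted. A sketch, assuming the columns described above (external ID, document type code, attachment title) plus file name and MIME type as additional illustrative columns; the type code `10001` and the tab separator are assumptions, so verify the exact manifest layout against the Data Workbench documentation for your tenant:

```python
import zipfile

# (external account ID, document type code, attachment title,
#  file name in the zip, MIME type) -- layout is illustrative only
DOCX_MIME = ("application/vnd.openxmlformats-officedocument"
             ".wordprocessingml.document")
rows = [
    ("0000100001", "10001", "Contract A", "contract1.docx", DOCX_MIME),
    ("0000100002", "10001", "Contract B", "contract2.docx", DOCX_MIME),
]

def build_upload_zip(zip_path, rows, files):
    """Write manifest.mf (UTF-8 encoded) plus the documents into one zip."""
    manifest = "\n".join("\t".join(row) for row in rows) + "\n"
    with zipfile.ZipFile(zip_path, "w") as zf:
        zf.writestr("manifest.mf", manifest.encode("utf-8"))
        for name, payload in files.items():
            zf.writestr(name, payload)

build_upload_zip("attachments.zip", rows,
                 {"contract1.docx": b"...", "contract2.docx": b"..."})
```

The key detail from the steps above is encoding manifest.mf explicitly as UTF-8 before it goes into the zip.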


Mass upload of attachments

  • Go to the C4C HTML web browser (Only Google Chrome is supported) and Import the attachments via the Data Workbench Workcenter
  • On the IMPORT tab, Select “Import Attachment”
  • Select the “Business System ID” (the system from which the Business Object data was replicated, in this case the SAP ECC system)
  • From “Select an Object” list, select the entity name created in step 4b above

09 Data Workbench.JPG

  • Click on next and confirm the "Successfully Submitted" message
  • Browse for the zip file and click on Upload and on Submit

 

10 Upload.JPG

Now all documents are uploaded to C4C and attached to the specific accounts. In the Data Migration Workbench go to the Monitor tab to see the status of the upload

11 monitor.JPG
Go to one of the accounts and click on the Attachments tab to see the uploaded document.

12 account.JPG
The following scenarios for attachment upload are supported by the Data Workbench:

  1. Import of attachment for any business object which was imported via the Data Workbench, just use the Import attachment by providing the valid external key.
  2. The import of attachments for any business object that was created directly in Cloud for Customer is not supported via Data Workbench, for example prospects
  3. Creation of an OData service is required as explained above
  4. Import of attachments for the business object that was replicated from any other external system like SAP CRM, SAP ERP, only master data objects like Employee, Customer/Account etc. are supported.
  5. The Data Workbench Workcenter is currently (in version 15.11) only supported in the Google Chrome browser.

 

I Am … An Intrapreneur


I Am An Intrapreneur.jpg

In my last post, we discussed the definition of an intrapreneur and detailed a few traits that can help you recognize intrapreneurs around you. It can be rewarding – and quite relieving – to recognize your inner intrapreneur. At least, it was for me.

For the entrepreneur-minded who are eyeing corporate roles, their skills may not be as applicable as they think. They must be prepared to learn how to operate in a new grid, while growing and relying on skills that may not have been as important outside the corporate world.


An intrapreneur is not an entrepreneur

Let’s clarify something: Intrapreneurship is not synonymous with entrepreneurship. Yes, the definition of “intrapreneur” is an entrepreneur inside a larger corporation; however, it is more like being in a different country with a different language, currency, and governance.

Intrapreneurs are often much more constrained by excessive consensus building, limited budget freedom, and mind-numbingly slow progression. There are overlaps between these realities, yet the strength and application of the intrapreneurial spirit often vary dramatically. Nevertheless, the common goal is to find a convergence point, where new value is realized and pain points are resolved.

Often, this ignites conversations in areas that many team members may find boring, distracting from current priorities, or far outside their organization’s responsibility. Also, going “off the grid” usually puts the business area on someone else’s grid to navigate complexity. The intrapreneur must be able to communicate in the company culture in the context of language, currency, and governance. And for the creative entrepreneur, this environment could be considered toxic.


The journey: Finding who you are

In a previous role, I was asked to move a rather large initiative at SAP, which taught me about the role of intrapreneurs within the concept-to-reality lifecycle. One of my key intrapreneur moments happened in the midst of a heated steering committee exchange. We were years into this initiative and solidly within the reality stage, generating tens of millions in revenue with the potential to rev up this e-commerce engine even further. While debating over go-to-market strategies and marketing approaches, I found very little reception of my ideas and proposals based on best practices.

That’s when it hit me. We were focused on a different challenge. We had built this amazing engine; yet, it still lacked many fundamental elements such as standardized processes, support agreements, and standard operating and IT procedures that governed the rest of the company.

At that moment, I realized two things:

  1. I continued to innovate
  2. My team was focused on stabilizing the operations

I became part of the problem. My skills were no longer bringing the kind of value needed to take the initiative into the operations phase. Once I realized what was happening, I immediately removed myself from my lead role. It was simply the right thing to do for my company and allowed me to find another way to channel my intrapreneur-self.

Lesson #1: Assess your strengths and value-add at different stages of your initiatives. You may just find that you need to be who you are on some other project.


It’s okay to have a net

On the flip side, many intrapreneurs crave – and thrive – within the stability of a corporate environment. Many find the insecurity of entrepreneurship overwhelming and, at times, unhealthy.

As my career matured, I began to recognize a yearning to follow my passions and make a social impact in business. I even quickly adopted the term “social enterprise” – but in reality, I was looking to go beyond talking about certain social issues and become part of a solution. At the time, I decided to venture out into the world of entrepreneurship. It was a tough and humbling experience. I found myself much more effective working within the stability of a corporate environment.

I’m still battling the comparisons of living the “big E” life-without-a-net lifestyle by working to fill skill gaps that those ventures uncovered in my personal development. There is still hope for me yet. Yet, I learned another good lesson…

Lesson #2: You don’t have to leave your job and launch a startup to make an impact.

Intrapreneurs are increasingly accelerating innovation, which is very relevant to the digital business conversation. As the hyper-convergence driving this transformation continues, we will continue to see major shifts in the required skills and capacities of employees along this journey. Plus, we should expect to witness a true convergence of the intrapreneur and entrepreneur worlds.

I’m curious to see how all of this morphs into a completely new type of employee-owner. My guess is that it already exists, and someone has already coined a term for it. Please pass along that information when you see it, and I’ll continue to search for a nice skills comparison. Still, I’m relieved to know that I am an intrapreneur and that it’s okay for me to struggle as an entrepreneur.

How to get the new ABAP editor in LSMW


SAP LSM Workbench (LSMW) has for years been a much-used tool for data migration into SAP. It is both loved and hated :-) While LSMW had its last update in 2004, it remains a much-used tool in this age of more modern toolsets such as SLT, SAP DS and the like.

 

For many frequent and hard-core users of LSMW a big nuisance is the old-style ABAP editor. This old editor takes up a lot of development time, especially in those ABAP-rich LSMW projects.

 

One night, bored and out of beer, I managed to develop a relatively simple enhancement that enables the new ABAP editor for LSMW.

(Mangled code completion context list is thanks to Windows 10 & a 3K screen)


Compare that with what you have been working with for the last decades:

 

 

Features

  • New ABAP editor for all ABAP coding within LSMW (field mappings, events, user defined routines and form routines)
  • Code completion
  • Use of the Pretty printer
  • Use of the ABAP Syntax checker
  • Use of ABAP patterns
  • No Modification required, just a single implicit enhancement spot
  • Fix of a small LSMW bug where the wrong line is highlighted when doing a syntax check in the __GLOBAL DATA__

 

 

Limitations

  • Code completion is not aware of globally defined variables
  • A few, more exotic, editor menu commands are not working and will return 'Function not implemented'
  • The use of Edit-->Find/Replace issues a warning and will eventually cause a short dump (but who needs this function, eh?)

 

The enhancement

The implementation of the new ABAP editor takes just one single Implicit enhancement spot. No modification or any other unwanted hacking! It has been tested on an ECC 606 system with LSMW version 4.0.0 (2004) and SAP Basis 731/02.
Update: Also tested on a brand new ERP 740 SP12 on a 742 kernel with HANA DB underneath.

 

  1. Create an Implicit enhancement spot (how-to) at the start of Subroutine EDITOR_START of Function group /SAPDMC/LSMW_AUX_080



  2. Paste in the code attached to this post & activate.

  3. Create a user parameter ZLSMW_NEWEDITOR in SE80 (how-to: scroll all the way down). Assign the parameter with value 'X' to each user who wants to use the new editor. All other users will not be affected.

  4. Start LSMW!

 


Give it a try and inform me of any bugs. As stated above, not all user commands work. All the important ones do, and for most of the others I have managed to catch the command and issue a friendly 'not implemented' message. Ideally I'd like to change the PF-Status of the screen and remove the unwanted commands, but this seems not that easy for now.



SDA Setup for SQLServer 12


Smart Data Access (SDA) is a slightly older feature in HANA. I recently got a chance to set it up for SQL Server 12. I read a lot of documents/blogs/YouTube videos but still ran into issues and couldn't implement it in the first go. It took me two or three iterations to get it installed correctly. I decided to put down the exact steps one should follow to implement SDA for SQL Server. I believe with these steps one should be able to install SDA for SQL Server in the first go without wasting much time.


SDA Definition:

 

SAP HANA Smart Data Access enables remote data to be accessed as if it were in local tables in SAP HANA, without copying the data into SAP HANA. Specifically, in SAP HANA you can create virtual tables which point to remote tables in remote data sources. It is possible to write queries in HANA combining HANA native tables and virtual tables.


SDA Architecture:


fig_1.png


We are configuring HANA Server SH1 and connecting to SQLSERVER and another HANA Database.


Steps to Configure SDA :

 

a.      Linux Users

b.      Download / INSTALL unixODBC Driver Manager

c.      Download / INSTALL SQL Server Drivers

d.      SETUP .odbc.ini file

e.      Test

f.      Setup SDA for SQL server in HANA STUDIO

 


 


a. Linux Users

I am sure everyone knows about users, but just to reiterate: when you install HANA with the root user, it automatically creates a <SID> user. If you don't know the <SID> user you can run the following command to find it.


>> cat /etc/passwd

 

Root User:

 

With the root user you will set up the complete configuration of the HANA system, including unixODBC and the other ODBC drivers.

 

<SID>user:

 

The <SID> user may not have permissions to set up configuration files unless given special permissions. Generally all HANA configurations are done with the root user.

HANA Studio connects as the <SID> user, so some of the configuration files you created as the root user are not visible to the other users in the system.

 

 

b. Download / INSTALL the unixODBC 2.3.0 Driver Manager

      

Download unixODBC 2.3.0 from:

 

http://www.unixodbc.org/


fig_2.png

 

 

Once you have downloaded this driver (unixODBC-2.3.0.tar.gz), move it to the HANA server (you can use FileZilla or other file transfer software).

 

Log in as the root user and go to the directory where you copied the driver.

>>  gunzip unixODBC-2.3.0.tar.gz

>>  tar xvf  unixODBC-2.3.0.tar

>>  ls 

 

This will show you the unixODBC-2.3.0 folder. Type the following commands to install the unixODBC driver manager:

 

>> cd unixODBC-2.3.0

>> ./configure

>> make

>> make install

 

Once the build is done, check whether the driver manager installed properly:

>> isql --version

The output should be unixODBC 2.3.0


c . Download /Installing SQL Server Drivers :

 

Download MS SQL SERVER Driver for Suse Linux:


(google for other flavors of Linux drivers for SQLSERVER)

 

http://www.microsoft.com/en-us/download/details.aspx?id=34687


fig_3.png

Once downloaded, copy the SQL Server driver to the HANA server (maybe to the same location where you put the unixODBC file, using FileZilla or other file transfer software).

 

Install the MS SQL Server driver. Unzip the file:

 

Login as root user

>> gunzip msodbcsql-11.0.2260.0.tar.gz

>> tar -xvf msodbcsql-11.0.2260.0.tar

 

This will extract a folder msodbcsql-11.0.2260.0 (in my case under the /mnt directory).

To check whether the SQL Server driver unpacked properly, go to the msodbcsql directory:

 

>> cd /mnt/drivers/msodbcsql-11.0.2260.0

 

Verify the version and install:

 

>> ./install.sh verify


fig_4.png


>> ./install.sh install


fig_5.png

Check again to see if it’s installed.

 

>> ./install.sh verify

fig_6.png


d. SETUP .odbc.ini file

 

Check the odbc.ini file by using the command

 

Login as root user

 

>> odbcinst -j

 

fig_7.png


The odbc.ini file is visible at two locations: one in the root home directory and one at /etc/unixODBC/odbc.ini.

 

Note: if you want all users to have access to the .odbc.ini configuration, add the entries to the "/root/.odbc.ini" file as the root user.

 

 

>> vi  /root/.odbc.ini       -- Open the file, enter the following and save it

 

----------------------------  MS SQL SERVER DRIVERS --------------------------

[MSSQL]

Server=SQLSVR,1433

Driver=/opt/microsoft/msodbcsql/lib64/libmsodbcsql-11.0.so.2260.0

 

----------------------------------------------------------------------------------------------

 

 

e.  TESTING

 

Login to Hana Server with Root User (in Putty)  and test connectivity for Sql Server :

 

Login As root user

>> isql  -v  MSSQL sa  Welcome1


fig_8.png

This works for the root user.


f.   Add data source from HANA studio

 

fig_9.png

  1. Error: SAP DBTech JDBC: [403]: internal error: Cannot get remote source objects: [unixODBC] Data source name not found and no default driver specified

 

If you see the above error, it suggests a problem with the unixODBC drivers, but the error message is misleading. It actually means that HANA Studio cannot see some of the configuration files in the HANA home directory. Remember: HANA Studio connects as the <SID> user.
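When chasing this error it helps to confirm which DSNs a given .odbc.ini actually defines. Since .odbc.ini uses INI-style sections, a small sketch with Python's configparser works (note: decorative divider lines like the dashed comment rows in the examples above would need to be removed or prefixed with '#' before parsing a real file):

```python
import configparser

def dsn_names(odbc_ini_text):
    """List the data source names (INI sections) defined in .odbc.ini text."""
    parser = configparser.ConfigParser()
    parser.read_string(odbc_ini_text)
    return parser.sections()

# Sample mirroring the MSSQL entry configured earlier
sample = """
[MSSQL]
Server=SQLSVR,1433
Driver=/opt/microsoft/msodbcsql/lib64/libmsodbcsql-11.0.so.2260.0
"""
print(dsn_names(sample))  # ['MSSQL']
```

Running this against both /root/.odbc.ini and the copy in the <SID> user's home quickly shows whether the DSN the error complains about is visible in each location.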

 

Do the following

 

Log in HANA Server as root user in putty

 

>> cp /root/.odbc.ini  /usr/sap/<SID>/home    (ex:  cp /root/.odbc.ini  /usr/sap/SH1/home)

 

Log out of the HANA server, go back to HANA Studio and try to connect to SQL Server. This is how I got it to succeed.

 

   

Configuring  Other Hana Server (Server sid name : FDC )

 

Login to HANA Server (SH1 ) as root user

 

>> vi /root/.odbc.ini

----------------------------  MS SQL SERVER DRIVERS --------------------------

[MSSQL]

Server=SQLSVR,1433

Driver=/opt/microsoft/msodbcsql/lib64/libmsodbcsql-11.0.so.2260.0

 

--------------------------------- HANA SYSTEM (FDC) ---------------------------------------

[FDC]

Server=FDC:30015

Driver=/usr/sap/FDC/hdbclient/libodbcHDB.so

 

>> cp /root/.odbc.ini  /usr/sap/SH1/home    (for HANA Studio to see this file)

 

 

 

Connecting other databases: I will update this section when I configure them.

 

 

 

Conclusion:

 

Now we have connected to SQL Server and to a different HANA server. This worked for me, and I hope it works for you as well in your first go. Whenever I add other databases as sources I will update the process with my findings in this blog. Thanks for reading.

Transit warehousing scenario addressed in SAP TM


This article provides an overview of the transit warehousing scenario, which is implemented in SAP TM 9.3 together with SAP EWM 9.3. This scenario is not available for a shipper scenario but only for a freight forwarder/LSP scenario, where warehouse operations and execution are implemented in SAP EWM and integrated with freight forwarding operations using SAP TM.

In the transit warehousing scenario, the freight forwarder takes responsibility for receiving cargo from shippers based on inbound transportation planning in SAP TM. The cargo is consolidated in such a way that shipments from multiple customers with the same destination locations are grouped together and transported to a transit warehouse; from there the cargo is shipped to the next location of the transportation chain and finally delivered to the consignee based on outbound transportation planning in SAP TM.

Hence a major implication of this process is that both inbound and outbound transportation planning are possible. Also, the cargo received from multiple shippers is very variable, so it is managed as handling units (HUs) and the cargo information is entered directly in the warehouse documents.

In the warehouse, for a transit scenario, the structuring is done in such a way that HUs with the same destination location are grouped together in the same storage bin. Exceptions to this are HUs with dangerous goods or high-value cargo, which are put in separate storage bins.

Then, based on the outbound plan sent by the transportation planner in SAP TM, the HUs are loaded onto the truck directly from the staging areas. A lot of data exchange occurs between SAP TM and SAP EWM, since SAP EWM informs SAP TM about the various steps performed in the transit warehouse, like arrival at the checkpoint, departure from the checkpoint, loading and unloading completion, etc.

To implement this scenario there has to be a proper warehouse organizational structure and warehouse master data set up. The warehouse structure comprises:

a) Warehouse area with storage bins for postal codes

b) Warehouse area with country- and region-specific storage bins

c) Warehouse area for handling HU containers

 

Below enclosed is the process flow to be used for implementing this scenario:

a) Create a forwarding order in SAP TM - Here customer service creates a forwarding order so that cargo can be transported from shipper to consignee. Once the cargo is picked up from the shipper's premises, it is transported via a transit warehouse which belongs to the transportation network of the freight forwarder and is managed in SAP EWM. The output of this step is the creation of a freight unit.

 

b) Create a freight order as part of the pick-up stage - In SAP TM, based on transportation planning, a freight order is created for the pick-up stage, and the stage details show the pick-up from the shipper location to the transit warehouse. At this time the carrier is assigned to this order.

 

c) Freight order status is updated in SAP TM - Once the cargo is picked up from the shipper location, the freight order status is updated to Departed. When transportation planning is completed, the transportation planner sends an unloading request to SAP EWM containing the freight order number. SAP EWM automatically creates a TU (transportation unit), an inbound delivery, and HUs, whereby the HUs are assigned to the TU.

 

d) The truck arrives at the checkpoint and the driver informs the warehouse clerk of the freight order number. The warehouse clerk assigns a free door to the TU, updates the status to Docked at Door, and tells the trucker to drive to the assigned door. During this time an unloading notification message for arrival at the checkpoint is sent by EWM, and the TM freight order status is updated to Arrived.

 

e) The warehouse clerk then receives the packages once the unloading process starts. The HUs are received using an RF device; once the packages are identified, an HU label is attached to them. The system automatically performs a goods receipt in EWM for each HU.

 

f) Once all HUs are unloaded, the truck leaves the door and the warehouse clerk confirms the departure from the checkpoint. During this time the unloading message for departure is triggered by SAP EWM, which updates the freight order status to Departed.

 

Now the standard process is followed to ship the product from the transit warehouse to the next location in the transportation chain. After creation of the pre-carriage/pick-up freight order, SAP TM sends the outbound planning information to SAP EWM, and the same steps repeat as explained above: creation of the freight order, sending the loading request, truck arrival at the checkpoint, staging the handling units, loading the truck, and truck departure.
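The status updates exchanged between SAP TM and SAP EWM in the flow above can be pictured as a small state machine. The sketch below is purely illustrative (status names and allowed transitions are simplified assumptions based on this description, not SAP code):

```javascript
// Simplified freight order status flow: the TM status is only updated
// when EWM reports the corresponding checkpoint event.
const transitions = {
  Created:  ["Departed"],   // truck leaves the shipper premises
  Departed: ["Arrived"],    // arrival at the transit warehouse checkpoint
  Arrived:  ["Departed"],   // departure after unloading/loading completes
};

function updateStatus(order, next) {
  // reject events that do not fit the documented sequence
  if (!(transitions[order.status] || []).includes(next)) {
    throw new Error(`Invalid transition ${order.status} -> ${next}`);
  }
  return { ...order, status: next };
}

let order = { id: "FO-1", status: "Created" };
order = updateStatus(order, "Departed"); // step c) above
order = updateStatus(order, "Arrived");  // step d) above
```

A real integration would of course drive these transitions from the unloading notification messages rather than direct calls.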

Web Dynpro versus SAPUI5


Authors


Vijaya Kumar Veeraraj (I068092)           Vishal V K (I304251)


Introduction

Software is an integral part of today’s world. “Mobile while mobile” concept through smartphones, tablets and other devices has made it easier to access applications and services in private as well as in business life. Such a ubiquitous mobile world demands a better user experience for business applications as the software would be more desirable to the end user.

The two predominant SAP technologies used for the development of user interfaces are Web Dynpro and SAPUI5, both of which have their own advantages and limitations. Being in the SAP world, we sometimes face the situation of choosing between the two. Here we have attempted to provide a comparison between Web Dynpro and SAPUI5, taking into consideration the scenarios where each is used, performance, and the technical details involved. This helps the reader compare their requirements for the software and then decide whether to use Web Dynpro or SAPUI5.


Overview

Web Dynpro

  • Web Dynpro is a programming model provided by SAP, implemented in Java and ABAP. It is well suited to generating standardized user interfaces (UIs) and minimizing the time needed to build Web applications.
  • Web Dynpro applications are built using declarative programming techniques based on the Model View Controller (MVC) paradigm.
  • Used for creating desktop applications.

SAPUI5

  • SAPUI5 is a JavaScript-based UI library which is designed to build cross-platform business applications. It combines new qualities such as openness, flexibility and high speed of innovation with known SAP strengths like enterprise readiness and product standard support.
  • It also has a powerful support to theming based on CSS.
  • SAPUI5 applications can run both on desktop as well as mobile devices. sap.ui.commons library controls are used for creating desktop applications and sap.m library controls are used for creating mobile applications.
  • SAPUI5 also allows application developers to create new UI libraries and custom controls. This way, UI5 development groups do not become a bottleneck for application groups that need a certain functionality.

 


Architecture

Web Dynpro

[Figure: Web Dynpro application architecture]

The above diagram shows the architecture of a Web Dynpro application, which is based on the Model View Controller paradigm.

The model forms the interface to the back end system and thus enables the Web Dynpro application access to back end data.

The view is responsible for the representation of the data in the browser.

The controller lies between the view and the model. The controller formats the model data to be displayed in the view, processes the user entries made by the user, and returns them to the model.

Views and controllers form a 1:1 relationship. It is also possible to have a component without views, in which case no windows are needed and the component does not implement an interface view. Components without any visual interface are called faceless components.

 

[Figure: Web Dynpro component with application entry point]

An application is an entry point into a Web Dynpro component. Web Dynpro applications may be embedded in the portal environment, they may be displayed by the SAP NetWeaver Business Client, or they may be displayed by a browser.

SAPUI5

[Figure: SAPUI5 MVC architecture]

It follows the MVC architecture. On the client side, views are responsible for defining and rendering the UI. The App view is the top-level view, which contains the other views. The model, either in JSON or XML, manages the application data. The i18n model is used for locale-dependent texts, the OData model for data retrieved from the backend using OData services, and the device model for device-specific data required at runtime. Along with these, the developer can also create named models. The controller reacts to view events and user interaction by modifying the view and the model.
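The interplay of the three roles can be sketched in plain JavaScript. This is only an illustration of the pattern; a real SAPUI5 app would use sap.ui.model.json.JSONModel and declarative views rather than the hypothetical classes below:

```javascript
// Minimal MVC sketch: the model notifies bound "views" on change,
// and the controller only manipulates the model, never the view.
class Model {
  constructor(data) { this.data = data; this.listeners = []; }
  setProperty(key, value) {
    this.data[key] = value;
    this.listeners.forEach(fn => fn(this.data)); // push update to views
  }
  bind(fn) { this.listeners.push(fn); fn(this.data); } // initial render
}

const model = new Model({ greeting: "Hello" });
let rendered = "";
// "View": re-renders whenever the bound model changes
model.bind(data => { rendered = `<Text>${data.greeting}</Text>`; });

// "Controller": reacts to a user event by updating the model only
function onPress() { model.setProperty("greeting", "Hallo Welt"); }
onPress();
```

After `onPress()`, the view markup reflects the new model value without the controller touching the view directly, which is the separation the diagram describes.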

[Figure: Basic files of an SAPUI5 application (sapui5 architecture2.png)]

The above diagram gives an overview of the basic files for every SAPUI5 application.

  • manifest.json: The manifest.json file is used to configure the app settings and put the important information needed to run the application. Configurations include information regarding the models, routing etc. Using this file not only helps you to write less application code, but also ensures that you can access the information needed before the app is instantiated.
  • Component.js: The Component.js implements the logic to create the application’s root view (App view) and the model instances. The component container loads the component when the app is started. The SAP Fiori Launchpad acts as the Component container. For standalone applications, the index.html file contains the component container with a reference to the component.
  • Root view: The App.view.xml defines the root view of the application. In most applications, the App.view.xml contains App or SplitApp as the root control.
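To make the descriptor concrete, a minimal manifest.json might look like the following. All ids, paths, and names here are hypothetical examples; it is shown as a JavaScript object so the structure is easy to inspect:

```javascript
// Sketch of an app descriptor: app metadata plus UI5-specific
// configuration (root view and model definitions).
const manifest = {
  "sap.app": {
    id: "my.demo.app",                      // assumed application id
    type: "application",
    i18n: "i18n/i18n.properties"
  },
  "sap.ui5": {
    rootView: { viewName: "my.demo.app.view.App", type: "XML" },
    models: {
      i18n: { type: "sap.ui.model.resource.ResourceModel" },
      "":   { dataSource: "mainService" }   // default (OData) model
    }
  }
};
console.log(manifest["sap.ui5"].rootView.viewName);
```

Because this information lives in the descriptor, the component can read it before any application code runs, which is the point made above.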

 

Required skill set

Web Dynpro

The main skills that one needs to have for creating WD applications are

  • ABAP OO : Used for Web Dynpro ABAP
  • JAVA : Used for Web Dynpro JAVA


SAPUI5

The main skills that one needs to have for creating SAPUI5 applications are

  • JavaScript: Used for creating the Views and Controllers.
  • XML: Used for creating the Views and Fragments.
  • CSS: Used for theming and adding custom styles in the Views.
  • Gateway and OData services: Integration with backend systems. The purpose of the Open Data protocol (OData) is to provide a REST-based protocol for CRUD-style operations (Create, Read, Update and Delete) against resources exposed as data services. An understanding of the metadata file of a service and consumption of OData services is essential.
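The CRUD-to-REST mapping mentioned in the last bullet can be sketched as follows. The service path and entity set are invented for illustration; a real app would delegate this to an OData model rather than building URLs by hand:

```javascript
// How CRUD operations map onto OData HTTP requests.
const service = "/sap/opu/odata/sap/ZDEMO_SRV"; // hypothetical service root

function odataRequest(op, entitySet, key, payload) {
  const verb = { create: "POST", read: "GET", update: "PUT", delete: "DELETE" }[op];
  // single-entity requests address the resource by key: EntitySet('key')
  const url = key !== undefined
    ? `${service}/${entitySet}('${key}')`
    : `${service}/${entitySet}`;
  return { verb, url, payload };
}

const read   = odataRequest("read", "Products", "42");
const create = odataRequest("create", "Products", undefined, { Name: "Pump" });
```

Here `read` resolves to a GET on `.../Products('42')`, while `create` is a POST to the entity set itself.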



Key features

Web Dynpro

  • Web Dynpro ABAP is integrated seamlessly into the ABAP Workbench (SE80) in the SAP GUI and does not require open-source add-ons like Eclipse for development.
  • Web Dynpro is generally used for Server-side rendering and server-side runtime environment into which many dedicated "hook methods" are available. The developer places his own custom coding within these hook methods in order to implement the desired business functionality. These hook methods belong to one of the broad categories of either "life-cycle" or "round-trip"; that is, those methods that are concerned with the life-cycle of a software component (processing that takes place at start up and shut down etc.), or those methods that are concerned with processing the fixed sequence of events that take place during a client-initiated round trip to the server.
  • Stateful applications are supported. If the page is changed, the required data remains intact so that you can access it at any time throughout the entire application context.
  • Web Dynpro is developer friendly as it provides wizards for the definition of forms and tables in the UI and source code in the controller methods, and also for other reusable components. Thus repetitive tasks of UI coding would be eliminated, thereby reducing the development time and effort significantly.
  • Anyone with basic ABAP knowledge will be able to create Web Dynpro applications with ease.


SAPUI5

  • SAPUI5 uses standards-based Web technologies to build a bridge between Web applications and mobile devices. HTML5 UIs with a native look and feel can be created, and they run on any device, including tablets and smartphones. Both desktop and mobile applications reuse the same core library and the same model and controller implementations.
  • Build beautiful HTML5 UIs with a modular control library for both desktop and mobile applications. It uses standard controls such as Value holders, Layouts and Dialogs. It takes user experience to the next level with various UX controls such as Shell, ThingInspector, etc. It also includes light analytical patterns by using graphics based on SVG or Canvas.
  • SAPUI5 also has support for theming capabilities to design beautiful UIs which fulfill user requirements. To separate structure from layout, SAPUI5 uses CSS3-based techniques allowing you to change the visual design of your applications without any modification.
  • Applications fetch data from backend ABAP or HANA systems by connecting with the help of OData services. The data fetched is either in JSON or in XML format.
  • SAPUI5 applications are stateless. In the mobile world, connectivity to the application server is not very stable. The server must not hold a state, which may be lost when the connection is lost, or which leaves objects in a locked state because the transaction cannot be completed. It is acceptable to read data while informing the client that the data may not be up to date because someone else has changed it (optimistic locking). The OData protocol implies such stateless behavior. The frontend developer can simply rely on the model to which they bind the UI being stateless, but the backend developers have to fulfil the provisioning side of the protocol.
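The optimistic-locking idea above can be illustrated with a version check: the client sends back the version token it originally read, and the server rejects the update if the record has changed in the meantime. This is a simplified sketch (OData uses ETags and a 412 response for this; the numeric token here is an assumption for brevity):

```javascript
// Update succeeds only if the client still holds the current version.
function update(record, changes, clientEtag) {
  if (record.etag !== clientEtag) {
    // someone else modified the record since the client read it
    return { ok: false, reason: "412 Precondition Failed: data changed" };
  }
  return { ok: true, record: { ...record, ...changes, etag: record.etag + 1 } };
}

const server = { id: 1, name: "Pump", etag: 7 };
const stale  = update(server, { name: "Valve" }, 6); // read an old version: rejected
const fresh  = update(server, { name: "Valve" }, 7); // current version: accepted
```

No lock is ever held on the server between the read and the write, which is exactly what makes the protocol safe for unreliable mobile connections.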


Advantages

Web Dynpro

  • When business applications are mapped to an ERP system, many complex standard applications are created whose user interfaces do not necessarily meet individual requirements. With the Web Dynpro ABAP configuration framework, you can adapt Web Dynpro applications without having to change the source code in the application.
  • Code-free UI configuration is possible in Web Dynpro ABAP, since developers can adapt Web Dynpro components for the whole system, and define which settings can be overwritten in customizing or personalization. Developer can define the configuration at design time in the configuration editor. Administrators can use this editor to adapt a configuration in the customizing layer. Thus, we are maximizing design and minimizing coding effort.
  • End users can personalize a Web Dynpro application at runtime. User adaptation options are restricted to functions that do not affect the running of an application. At application runtime, for instance, user can move table columns, hide UI elements using the context menu, or set default values for input fields. These adaptations are persisted implicitly, and the user is not requested to save these changes.
  • Modification-free adaptation and extension of UIs can be done in Web Dynpro applications which are integrated with Floor Plan Manager and the Business Object Processing Framework (BOPF).


SAPUI5

  • SAPUI5 applications can be rendered both on the desktop and mobile devices thus providing more agility through availability on any device & for any platform.
  • SAPUI5 is a library built on top of JavaScript and jQuery. JavaScript itself being a loosely-typed language, it has no fixed data types. JavaScript is also object oriented and hence, each controller can be visualized as a class and each function of the controller as a class method.
  • Enhanced User productivity and better user experience through increased flexibility, openness and pixel perfect design.
  • Integrated deployment of applications to SAP platforms. ABAP development tools and SAPUI5 ABAP Repository Team Provider plugin have to be installed in the Eclipse IDE for running applications on ABAP server.
  • Application performance is faster due to client side events. Responsiveness of applications is faster. As HTML content size is minimal, rendering on the browser is also faster.
  • SAPUI5 supports data transfer in JSON, XML and OData formats. The MVC architecture ensures that minimal or no changes are required in the application if the data format is changed.
  • Ability to customize and extend existing Controls. This provides an added advantage to the developer to create controls according to their requirements without having to wait for the application groups to provide the functionality.
  • Open source plugins and libraries can be easily integrated with SAPUI5 applications.
  • Theming features based on CSS3 help to provide a beautiful and pleasant UI.
  • The frontend development can be separated completely from the backend development, thus making parallel development possible.
  • Canvas and SVG are helpful for animation and graphical representation with Charts, etc. Thus it also provides Analytical capabilities which can be utilized when integrated with HANA as backend.

Hello World


Hello,

 

this is my first Blog Post.

PRE/POST XSLT step in integration scenario via PI


I decided to build a more advanced integration and write this separate blog post based on the comments on the first blog that I wrote about ME integration via PI: http://scn.sap.com/community/manufacturing/blog/2015/12/12/hands-on-sap-me-integration-or-how-i-learnt-pi-xi

 

The main feature that I discovered since then, is that it is possible to define more than 1 mapping step and thus implement PRE/POST XSLT steps.

In my sample scenario the first step is a Message Mapping step (iDOC to iDOC) that selects only needed tags from the incoming iDOC

[Screenshot: Message Mapping step (FirstStep.png)]


and makes a transformation of one of its fields (Description) based on the string concatenation rule:

[Screenshot: iDOC field modification rule (modify iDOC.png)]


So, the inbound scenario I implemented looks like this:

 

iDOC is sent from ERP->received by PI->as PRE XSLT step, the initial iDOC is converted to another one with modifications using the capabilities of PI->the modified iDOC is transformed by the XSL to the SOAP request->the SOAP request is received and processed by SAP ME

 

 

Here is a screenshot of the imported Material with 2 times modified Description field. Prefix Alex+ was added by the XSL transformation, suffix +IDOC2IDOC was added at PRE XSLT step:

[Screenshot: Imported material with modified Description field (ImportedMaterial.png)]


The main point was, of course, not to modify the iDOC before XSL transformation, but actually to show that it is possible to define the PRE or POST XSLT steps. Since it is possible to define a Java Class at PRE/POST XSLT step to be executed instead of a Message Mapping, any custom code can be executed within it (not necessarily for mapping purposes). Just at the end it should return the message to the next mapping (in this case XSL transformation) step.
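Conceptually, the operation mapping is just a pipeline where each step receives the message produced by the previous one. The sketch below illustrates that chaining with the Description modifications from this example; it is language-neutral pseudologic in JavaScript, not PI code (in PI each step would be a Message Mapping, a Java class, or an XSLT):

```javascript
// Two mapping steps chained as in the blog's scenario:
// the PRE step appends "+IDOC2IDOC", the XSL step prepends "Alex+".
const preStep = msg => ({ ...msg, description: msg.description + "+IDOC2IDOC" });
const xslStep = msg => ({ ...msg, description: "Alex+" + msg.description });

const pipeline = [preStep, xslStep];
// each step's output becomes the next step's input
const run = msg => pipeline.reduce((m, step) => step(m), msg);

const out = run({ material: "M-01", description: "Gear" });
// out.description now carries both modifications, as in the screenshot
```

The key property, as noted above, is that a step may do arbitrary work as long as it hands a message to the next step in the chain.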

 

 

I am looking forward to your comments. Your comments on my first PI blog helped me to study PI deeper and in the right direction, and so this blog on how to implement PRE/POST XSLT steps was born.

 

I also hope you find the information about ME integration via PI as interesting as I do. Your likes and ratings would be appreciated.

 

 

Best  regards,

Alex.

Simple Insurance... are we there yet?


“Run Simple” has been a very catchy phrase since its inception as SAP's new theme, rolled out as a brand message at the annual SapphireNow user conference in Orlando back in June 2014. It was a very good strategy for SAP to convey its message to existing customers and to attract potential customers. It earned brownie points and an edge over competitors, I must say, and it works quite well. Everyone loves it.

 

Now comes the not-so-simple part: the different core applications in SAP for Insurance. In the “SAP for Insurance” world, we have a value map that demonstrates how different core processing application modules such as FS-QUO, FS-PM, FS-CD, ICM (a cross-industry solution), FSCM, and FS-RI can be used to run core insurance operations. This is the current landscape model, which is projected as the customer-centric model.

 

 

[Figure: Current customer-centric model (Current Customer Centric Model.jpg)]

 

This customer-centric model, with its different core insurance applications, has certain limitations:

 

  1. Complex landscape – if you notice the applications are based on the lifecycle of an Insurance policy. For an Insurance policy each application kicks in depending upon the stage of the insurance policy.
  2. Data footprint is large, as there are header and detail table structures in the database, and the data models of each application have a varied design.
  3. No Real-time processes – Data has to be pushed or pulled through different applications.
  4. Batch jobs take extensive time to complete. Failed batch jobs are a nightmare for Insurers.
  5. Data replication, Duplications, Aggregates are common because of the complex landscape.

 


Simple Insurance – A new vision

 

This customer centric model can be remodeled to make SAP for Insurance, simple. Now we have a new catch line “Simple Insurance”.

How can this Simple Insurance model be structured? I want this model to be as simple as it sounds and to suit the title Simple Insurance. I want the best of what all the existing SAP core insurance applications offer in a single carton, based on real-time analysis of financial and operational business scenarios. Building an ERP system from scratch for each SAP Insurance component (and then putting in time and resources to integrate them) is a big no.

Here is what Simple Insurance should be devised of, and what every Insurer would prefer to have:

  1. Simpler landscape with less interfaces to other SAP/ Non SAP system components
  2. Smaller data footprint when it comes to management of data
  3. Real-time processes across lifecycle of a policy
  4. No hassles related to batch jobs / time and resource consuming batch jobs
  5. No data replication, no duplications and no aggregates

 

What the Insurers are looking for are;

From user’s perspective

  • Simple administration – Insurers are interested in having a user handle the entire lifecycle of a policy of a customer at different stages. If the simple insurance landscape allows this then customer service touch point provided can close the gap between the customer/ policyholder and the Insurer.
  • Simpler user experience - for both the online customer and the Insurer. No one wants to run door to door to get their requests attended / queries answered. Instead a single window to handle such things is very much convenient. In the same way no one would prefer to navigate to different SAP applications to get things done. It would be wonderful to have a single window screen which is much user friendly rather than referring to some t-code cheat sheet.

 

From IT management perspective

  • Simple to implement – Minimal use of tools to guide and configure the application. And less time to go live.
  • Simpler development – From an IT management perspective, an enhancement in one of the core applications usually triggers enhancements in other dependent SAP core applications. Each of these core applications being on a different enhancement pack version makes the whole IT development process cumbersome.

 

Here is what the modern SAP Insurance landscape would look like with HANA for real time analysis;

[Figure: Modern SAP Insurance landscape with HANA (Capture landscape.JPG)]

 

“Simple Insurance” can be modeled as shown below by having a simple architecture in which the core systems act as a unified application.

[Figure: Unified landscape (unified Capture landscape.JPG)]

For e.g.; we have this SAP Policy Management (FS-PM) where the Insurer can control the whole life cycle of a contract, starting from the creation of an application, through policy issuance and ongoing contract maintenance, and up to the termination of the contract with the Insurance customer.

For this SAP Policy Management (FS–PM) provides interfaces for the integration of other SAP Insurance components. Taking some of the SAP components which are mentioned below as an e.g.:

  • Collections and Disbursements (FS-CD)
  • Claims Management (FS-CM)
  • Incentive and Commission Management (FS-ICM) and Portfolio Assignment (PFO)
  • Reinsurance (FS-RI)

Now these SAP Insurance components have their own architecture to manage the respective Insurance business process with its own database. It is like the Insurer has to implement each SAP insurance components as a separate ERP. In the figure below I have just considered FS-PM’s interfaces to the other main SAP insurance components while there can be many and with non SAP systems as well.

[Figure: Discrete landscape with FS-PM interfaces (Discrete landscape.JPG)]

The Simple Insurance architecture can be remodeled from this setup into a more simplified and robust architecture that keeps pace with constantly improving infrastructure (such as in-memory databases) and uses scalable options that help Insurers meet future requirements.


This simple insurance architecture can portray as single ERP and face to the Insurer. This can be achieved by having a single component as shown below.

 

[Figure: Integrated component (Integrated component.JPG)]

 

As the first step towards this simplicity, the core SAP insurance components can be concentrated upon and then the other SAP components like Finance and Risk, Procurement, HR & Investments can be looked at. Integration with the Cloud based solutions shall be an additional bonus.

 

Right now the concept of Simple Insurance is just an idea which Insurers would like to see as a reality in the future. Thus making the concept of having various complex SAP components knit together redundant. Instead have a common architectural framework with all the SAP components bundled in one unit.

16 Perspectives to Help Shape How You View Digital Transformation in 2016



[Image: Digital Transformation webcast series banner (Digital-transformation_shutterstock_F.JPG)]


Every company and every industry is facing change on a scale never seen before, meaning all organizations must quickly determine how to respond to and profit from all that change; in other words, they must formulate a digital transformation strategy and action plan. This is why the Americas' SAP Users Group (ASUG) is creating several series of webcasts and other content to help members in that endeavor.


 

Digital transformation was one of the most popular buzz phrases in 2015 and is likely to be so again this year, but how best to define it? Jonathan Becher, chief digital officer at SAP, provides an accurate definition in a recent blog:


 

“…every successful digital transformation has the following three elements: A new customer experience, a new business model, [and] a new value creation model.”


 

But that is not all that we are encountering this year. The notion of the “Fourth Industrial Revolution” is beginning to gain some traction – especially as over 2,500 leaders from business, government, international organizations, civil society, academia, media, and the arts participate in the 46th Annual Meeting of the World Economic Forum (WEF) in Davos-Klosters, Switzerland. This year’s summit is focusing on this new revolution as a fusion of technologies blurs the lines between the physical, digital, and biological spheres.


 

Digital Transformation and the Fourth Industrial Revolution: What it Means for You


 

The pace of change today is also unprecedented. Get used to it – things will never move so slowly again. Whether you call it “digital transformation” or the “Fourth Industrial Revolution,” these forces will greatly change our world and our lives in ways that we are only just starting to imagine and understand.


 

Among ASUG members, there is already much discussion about what this all means and how we can successfully navigate this world together. Not only do we have to work differently than before, but we also need to engage in thoughtful conversations and innovation to make sense of this onslaught of transformation at a faster pace and in a more coordinated fashion.


 

This is why we are pleased to announce our four-month “ASUG Perspectives on Digital Transformation” webcast series, starting Feb. 2.


 

ASUG Perspectives on Digital Transformation Webcasts -- SAP, Analysts, Partners, User Groups


 

We are creating a new ASUG Digital Transformation Program for our members that will serve as a “one-stop shop” for all things related to digital transformation and the Fourth Industrial Revolution across lines of business, industries, and technology areas. This webcast series will feature visionaries and thought leaders who will offer their insights and help you prepare to succeed in new ways in this increasingly disruptive world. To continue the conversation on this theme, we will engage members and customers during ASUG chapter meetings and SAP events and through blogs and social media.


 

During our first wave of webcasts, running Feb. 2 to Feb. 29, SAP executives will share their perspectives on digital transformation. Over the course of 16 webcasts over 28 days, ASUG members will hear from an impressive line-up of global leaders from SAP. At the end of the month, we will hold a special 90-minute session titled “The SAP HANA Platform – Powering the Digital Transformation” which will include a look at the results of our recent ASUG SAP HANA adoption survey.


 

Here are all the 16 exciting topics which will be explored:


 

Feb. 2: Enterprise Architecture: Architecting Your Digital Business of Tomorrow - Irfan Khan, Chief Technology Officer, Global Customer Operations, SAP


 

Feb. 4: Utilities: SAP’s Digital Utility Framework – Reimagine Your Utility - Henry Bailey, Vice President, Industry Business Solutions, SAP


 

Feb. 9: Chemicals: Digital Transformation in the Chemicals Industry - Don Mahoney, Vice President and Global Head of Chemicals, Industry Business Solutions, SAP


 

Feb. 11: Public Sector: Frictionless Public – Provide, Protect, and Prosper in the Digital Society - Isabella Groegor-Cechowicz, General Manager, Public Sector, SAP


 

Feb. 11: HR: Architecting HR to Meet the Needs of a Digitally Transformed Workforce - Mike Ettling, President, SAP SuccessFactors


 

Feb. 16: Asset Management: Rethinking Asset Management in the Digital Age - Achim Krueger, VP, Operational Excellence Solutions and LoB Asset Management, SAP


 

Feb. 17: Supply Chain: Digital Transformation in Supply Chain - Martin Barkman, VP, Head of Demand-Driven Business Planning, Extended SCM, SAP


 

Feb. 18: Oil & Gas: Inspire and Shape the Digital Energy Revolution - Ken Evans, Global Head Oil & Gas Industry Business Unit, SAP


 

Feb. 19: Digital Education for the Digital Transformation - Bernd Welz, Executive Vice President and Head of Scale, Enablement and Transformation, SAP


 

Feb. 19: Finance: Digital Transformation in Finance - Thack Brown, General Manager, Global Head LoB Finance, SAP


 

Feb. 22: Discrete Manufacturing: Digital Transformation in Discrete Manufacturing Industries - Stefan Krauss, General Manager Industry Cloud – Discrete Industries, SAP


 

Feb. 23: IoT: Connect, Transform, and Reimagine – SAP and the Internet of Things in 2016 - Nayaki Nayyar, General Manager and Global Head of IoT and Innovation GTM, SAP


 

Feb. 24: User Experience: Digital and IT Transformation with Design - Sam Yen, Chief Design Officer, SAP and Managing Director, SAP Silicon Valley


 

Feb. 26: Automotive in a Digital World - Holger Masser, Global Head of Industrial Business Unit Automotive, SAP


 

Feb. 29: The SAP HANA Platform: Powering the Digital Transformation - Matthew Zenus, Solution Manager, Data Warehousing Solutions and SAP HANA, SAP


 

Feb. TBD: Soon-to-Be-Revealed Special Digital Transformation Presentation - TBA Executive at SAP


 

 

And That’s Not All

 

 

 

We will follow this first wave of digital transformation webcasts with a second and third wave delivered by analysts and SAP partners. Last, and certainly not least, you can expect ASUG CEO Geoff Scott to weigh in on the topic of digital transformation on a future webcast.


 

As entrepreneur and tech executive Douglas Merrill once said, “All of us is smarter than any of us.” We need to work together to bring the power of digital transformation in the Fourth Industrial Revolution to our organizations. I hope that you can join us on this exciting journey.


 

Register here for any or all of our February ASUG Perspective webcasts. If you have any questions, please contact ASUG Community Advocate Paul Kurchina at paul.kurchina@asug.com.


 

Please note: We are also opening up the entire February series of digital transformation webcasts to non-ASUG members. Registration for non-ASUG members will be available starting Jan. 27.


Building your framework to work with SAP B1 - Part 1


Hello guys! In day-to-day development, any code that is written more than once should receive special attention. We should try to abstract it so that it doesn't need to be rewritten again.

So I will share with you some classes that I created to facilitate my day-to-day development with SAP B1.

In this first text, I will show a class that implements the basic database operations (add, update, delete, and "get by key") for a User Defined Table (UDT) of type "No Object".

 

Notes:

 

1. To understand the operation of the presented class better, you should be familiar with the classes and methods in the System.Reflection namespace. See here: System.Reflection Namespace

2. This class is not prepared to work with UDT of type "Master Data (Rows)" or "Documents (Rows)".

3. For development I use .NET Framework 4.5 and VS 2015. Some functionality may be unavailable in older versions.

 

The class, which I call UDTModelBase (explanations are in the comments of the class):

 

using SAPbobsCOM;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;
using System.Runtime.InteropServices;
using System.Text;
using System.Threading.Tasks;
namespace ConsoleSample
{
    /// <summary>
    /// Note¹: Each property in your class should have the same name and type as the corresponding field in the database.
    /// Note²: Wherever you see Program.oCompany, replace it with your own Company object.
    /// </summary>
    public abstract class UDTModelBase
    {
        #region Private properties

        /// <summary>
        /// Prefix of SAP database fields.
        /// (!) Note: SAP automatically adds "U_" in front of the name of every user-defined field.
        /// </summary>
        private const string PREFIX_FIELD = "U_";

        #endregion

        #region Public Properties

        /// <summary>
        /// Code of your object; should be unique.
        /// (!) Note: SAP creates this field automatically when you create a user-defined table.
        /// </summary>
        public int Code { get; private set; }

        /// <summary>
        /// Name of your object; should be unique.
        /// (!) Note: SAP creates this field automatically when you create a user-defined table.
        /// </summary>
        public string Name { get; private set; }

        #endregion

        #region Private Methods

        /// <summary>
        /// Gets the value from a property and converts it to a value compatible with the SAP field type.
        /// </summary>
        /// <param name="type">Type of the SAP field</param>
        /// <param name="property">Property whose value should be extracted</param>
        private dynamic getValue(BoFieldTypes type, PropertyInfo property)
        {
            switch (type)
            {
                case BoFieldTypes.db_Alpha:
                    if (property.PropertyType.Name.Equals("Boolean"))
                        return property.GetValue(this, null).ToString().Equals("True") ? "Y" : "N";
                    else
                        return property.GetValue(this, null).ToString();
                case BoFieldTypes.db_Numeric:
                    return (int)property.GetValue(this, null);
                case BoFieldTypes.db_Float:
                    return Convert.ToSingle(property.GetValue(this, null));
                case BoFieldTypes.db_Date:
                    return property.GetValue(this, null);
                default:
                    break;
            }
            return null;
        }

        /// <summary>
        /// Sets the value of a property.
        /// </summary>
        /// <param name="value">Value that should be assigned to the property</param>
        /// <param name="prop">Property that receives the value</param>
        /// <param name="owner">Instance that owns the property</param>
        private void setValue(dynamic value, PropertyInfo prop, dynamic owner)
        {
            if (prop.CanWrite)
            {
                if (prop.PropertyType.IsEnum)
                {
                    prop.SetValue(owner, Convert.ChangeType(value, Enum.GetUnderlyingType(prop.PropertyType)), null);
                }
                else if (prop.PropertyType.Name.Equals("Boolean"))
                {
                    prop.SetValue(owner, value.Equals("Y") ? true : false, null);
                }
                else
                {
                    prop.SetValue(owner, Convert.ChangeType(value, prop.PropertyType), null);
                }
            }
            else
            {
                // Code and Name have a private setter, so they are assigned here.
                typeof(UDTModelBase).GetProperty(prop.Name).SetValue(this, Convert.ChangeType(value, prop.PropertyType), null);
            }
        }

        /// <summary>
        /// Gets a new code for the object.
        /// </summary>
        /// <returns>Returns a new code</returns>
        private int getNewCode()
        {
            int maxCode = 1;
            Recordset oRs = null;
            try
            {
                // Run a query that converts Code to an int and gets the max value present in the table.
                string sql = @"SELECT MAX(CONVERT(INT, Code)) FROM [@" + this.TableName() + "]";
                oRs = Program.oCompany.GetBusinessObject(BoObjectTypes.BoRecordset);
                oRs.DoQuery(sql);
                if (oRs != null && !oRs.EoF)
                    maxCode = Convert.ToInt32(oRs.Fields.Item(0).Value);
            }
            finally
            {
                if (oRs != null)
                {
                    Marshal.ReleaseComObject(oRs);
                    oRs = null;
                }
            }
            return maxCode + 1;
        }

        /// <summary>
        /// Creates the default SELECT query used by GetByKey and ListAll.
        /// </summary>
        private string selectQuery()
        {
            string sql = "SELECT ";
            // Iterate over each property to get its name and build the SELECT list.
            foreach (PropertyInfo prop in this.GetType().GetProperties())
            {
                if (prop.Name.Equals("Code") || prop.Name.Equals("Name"))
                    sql += prop.Name + ", ";
                else
                    sql += PREFIX_FIELD + prop.Name + ", ";
            }
            sql = sql.Substring(0, sql.Length - 2);
            sql += " FROM [@" + this.TableName() + "]";
            return sql;
        }

        #endregion

        #region Public Methods

        /// <summary>
        /// Name of the SAP table that needs to be managed.
        /// </summary>
        public abstract string TableName();

        /// <summary>
        /// Adds or updates a record.
        /// </summary>
        /// <returns>true if the action succeeded</returns>
        public virtual bool AddOrUpdate()
        {
            SAPbobsCOM.UserTable oUserTable = null;
            bool success = true;
            try
            {
                // Get the user table object.
                oUserTable = Program.oCompany.UserTables.Item(this.TableName());
                bool isUpdate = false;
                // If Code is different from 0, check whether this is an update.
                if (this.Code != 0)
                    isUpdate = oUserTable.GetByKey(this.Code.ToString());
                // Iterate over each property of the class to get its value and set it on the corresponding field of the table.
                foreach (PropertyInfo prop in this.GetType().GetProperties())
                {
                    if (prop.Name.Equals("Code"))
                    {
                        if (!isUpdate) // If this is not an update, get a new code.
                        {
                            this.Code = this.getNewCode(); // Assign the new code to the Code property.
                            oUserTable.Code = this.Code.ToString();
                            this.Name = "K" + oUserTable.Code;
                            oUserTable.Name = this.Name;
                        }
                    }
                    else if (prop.Name.Equals("Name"))
                        continue;
                    else
                    {
                        // Convert the property value to a value compatible with the SAP field type.
                        oUserTable.UserFields.Fields.Item(PREFIX_FIELD + prop.Name).Value =
                            this.getValue(oUserTable.UserFields.Fields.Item(PREFIX_FIELD + prop.Name).Type, prop);
                    }
                }
                int ret = 0;
                if (isUpdate) // If this is an update, update the record...
                    ret = oUserTable.Update();
                else // ...otherwise add a new one.
                    ret = oUserTable.Add();
                if (ret != 0)
                {
                    success = false;
                    throw new Exception(Program.oCompany.GetLastErrorDescription());
                }
            }
            catch (Exception)
            {
                success = false;
                throw; // Rethrow without resetting the stack trace.
            }
            finally
            {
                Marshal.ReleaseComObject(oUserTable);
                oUserTable = null;
                GC.Collect();
            }
            return success;
        }

        /// <summary>
        /// Deletes a record.
        /// </summary>
        /// <returns>true if the action succeeded</returns>
        public virtual bool Delete()
        {
            SAPbobsCOM.UserTable oUserTable = null;
            bool success = true;
            try
            {
                oUserTable = Program.oCompany.UserTables.Item(this.TableName());
                // Check whether the record exists in the table...
                if (oUserTable.GetByKey(this.Code.ToString()))
                {
                    // ...and if so, remove it.
                    if (oUserTable.Remove() != 0)
                    {
                        success = false;
                        throw new Exception(Program.oCompany.GetLastErrorDescription());
                    }
                }
            }
            catch (Exception)
            {
                success = false;
                throw;
            }
            finally
            {
                Marshal.ReleaseComObject(oUserTable);
                oUserTable = null;
            }
            return success;
        }

        /// <summary>
        /// Loads a record into the object.
        /// </summary>
        /// <param name="code">Code of the record</param>
        /// <returns>true if the record was found</returns>
        public virtual bool GetByKey(int code)
        {
            bool success = true;
            Recordset oRs = null;
            try
            {
                oRs = Program.oCompany.GetBusinessObject(BoObjectTypes.BoRecordset);
                // Build the query to search the database...
                string sql = this.selectQuery();
                // ...filtered by the code.
                sql += " WHERE Code = " + code;
                oRs.DoQuery(sql);
                // Alternatively, you could call UserTable.GetByKey and then read the values
                // from UserFields.Fields.Item("field").Value into the properties of your class.
                if (!oRs.EoF)
                {
                    // Iterate over each property and set the value returned from the database.
                    foreach (PropertyInfo prop in this.GetType().GetProperties())
                    {
                        if (!prop.Name.Equals("Code") && !prop.Name.Equals("Name"))
                            this.setValue(oRs.Fields.Item(PREFIX_FIELD + prop.Name).Value, prop, this);
                        else
                            this.setValue(oRs.Fields.Item(prop.Name).Value, prop, this);
                    }
                }
                else
                    success = false;
            }
            catch (Exception)
            {
                success = false;
                throw;
            }
            finally
            {
                if (oRs != null)
                {
                    Marshal.ReleaseComObject(oRs);
                    oRs = null;
                }
            }
            return success;
        }

        /// <summary>
        /// Lists all records from the database.
        /// </summary>
        /// <returns>Returns a list with all records of this object</returns>
        public virtual List<dynamic> ListAll()
        {
            // Create a dynamic list; at this point the concrete type of the object is not known.
            List<dynamic> lst = new List<dynamic>();
            Recordset oRs = null;
            try
            {
                oRs = Program.oCompany.GetBusinessObject(BoObjectTypes.BoRecordset);
                // Get the query without a WHERE clause.
                string sql = this.selectQuery();
                oRs.DoQuery(sql);
                while (!oRs.EoF)
                {
                    // Create an instance of the same type as the current class.
                    dynamic oInstance = Activator.CreateInstance(this.GetType());
                    // Iterate over each property and set the value returned from the database.
                    foreach (PropertyInfo prop in oInstance.GetType().GetProperties())
                    {
                        if (!prop.Name.Equals("Code") && !prop.Name.Equals("Name"))
                            oInstance.setValue(oRs.Fields.Item(PREFIX_FIELD + prop.Name).Value, prop, oInstance);
                        else
                            oInstance.setValue(oRs.Fields.Item(prop.Name).Value, prop, oInstance);
                    }
                    // Add the new object to the list.
                    lst.Add(oInstance);
                    oRs.MoveNext();
                }
            }
            finally
            {
                if (oRs != null)
                {
                    Marshal.ReleaseComObject(oRs);
                    oRs = null;
                    GC.Collect();
                }
            }
            return lst;
        }

        #endregion
    }
}

To demonstrate a practical example, I created a console application that will be attached at the end of this text. (It is not permitted to attach a .rar file, so I will attach each class I created for this sample separately.)

 

Using the class:

 

Create a UDT as follows:

 

Table Name: Product.

Type: No Object.

 

Fields:

(Screenshot Product.PNG: the field definitions of the Product table.)
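If you prefer to create the table and its fields programmatically instead of through the B1 client, a sketch using the DI API's UserTablesMD and UserFieldsMD objects could look like the following. The field names, types, and sizes are assumptions based on the ProductModel class used later in this post, and Program.oCompany stands for your already connected Company object:

```csharp
using SAPbobsCOM;
using System.Runtime.InteropServices;

public static class ProductTableSetup
{
    public static void Create()
    {
        // Create the "Product" table itself, with type "No Object".
        UserTablesMD oTable = Program.oCompany.GetBusinessObject(BoObjectTypes.oUserTables);
        oTable.TableName = "Product";
        oTable.TableDescription = "Product";
        oTable.TableType = BoUTBTableType.bott_NoObject;
        if (oTable.Add() != 0)
            throw new System.Exception(Program.oCompany.GetLastErrorDescription());
        Marshal.ReleaseComObject(oTable);

        // Add the user-defined fields; SAP prefixes each of them with "U_".
        AddField("Description", BoFieldTypes.db_Alpha, 100);
        AddField("Price", BoFieldTypes.db_Float, 0, BoFldSubTypes.st_Price);
        AddField("RegistrationDate", BoFieldTypes.db_Date, 0);
        AddField("Active", BoFieldTypes.db_Alpha, 1); // "Y"/"N" flag
    }

    private static void AddField(string name, BoFieldTypes type, int size,
                                 BoFldSubTypes subType = BoFldSubTypes.st_None)
    {
        UserFieldsMD oField = Program.oCompany.GetBusinessObject(BoObjectTypes.oUserFields);
        oField.TableName = "@Product"; // user-defined tables are referenced with an "@" prefix
        oField.Name = name;
        oField.Description = name;
        oField.Type = type;
        oField.SubType = subType;
        if (size > 0)
            oField.EditSize = size;
        if (oField.Add() != 0)
            throw new System.Exception(Program.oCompany.GetLastErrorDescription());
        Marshal.ReleaseComObject(oField);
    }
}
```

Treat this as a sketch only: it needs a connected company and should be run once (rerunning it will fail because the table already exists, so in real code you would first check UserTables for the table name).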

Now I will implement my class for the UDT that I created.

 

I call my class ProductModel, and it inherits from UDTModelBase. After this, you need to implement the abstract methods. UDTModelBase has just one abstract method, TableName(); inside it, return your table name, in this case "Product". Then create your properties as public, with the same names and types as your fields in the database. After that, your class should look like this:

 

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace ConsoleSample
{
    public class ProductModel : UDTModelBase
    {
        #region Public Methods

        public override string TableName()
        {
            return "Product";
        }

        #endregion

        #region Public properties

        public string Description { get; set; }
        public Double Price { get; set; }
        public DateTime RegistrationDate { get; set; }
        public bool Active { get; set; }

        #endregion
    }
}

Now let's test the functionality presented. In the Program class, write some tests like this:

 

static void Main(string[] args)
{
    try
    {
        if (oCompany == null || !oCompany.Connected)
        {
            connect();
        }
        if (oCompany != null && oCompany.Connected)
        {
            try
            {
                int code = 0;

                #region Add
                ProductModel product = new ProductModel();
                product.Description = "Cellphone";
                product.Price = 100.50;
                product.RegistrationDate = DateTime.Now;
                product.Active = true;
                if (product.AddOrUpdate())
                {
                    Console.WriteLine("Operation completed successfully");
                    code = product.Code;
                }
                product = null;
                #endregion

                #region Update
                product = new ProductModel();
                if (product.GetByKey(200))
                    Console.WriteLine("Register found");
                else
                    Console.WriteLine("Register not found");
                if (product.GetByKey(code))
                {
                    product.Description += " new";
                    if (product.AddOrUpdate())
                    {
                        Console.WriteLine("Operation completed successfully");
                    }
                }
                product = null;
                #endregion

                #region Delete
                product = new ProductModel();
                if (product.GetByKey(code))
                {
                    if (product.Delete())
                        Console.WriteLine("Operation completed successfully");
                }
                #endregion

                #region ListAll
                // Insert some records in the database before running this.
                product = new ProductModel();
                List<ProductModel> lst = product.ListAll().Cast<ProductModel>().ToList();
                foreach (ProductModel p in lst)
                {
                    Console.WriteLine("Code [" + p.Code + "], Description [" + p.Description + "]");
                }
                #endregion
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
}

I hope you like this post.

If you liked it, give me your feedback.

If you used it and made some changes, please share them with other members of this community.

 

I intend to continue this series of posts if it helps other members.

 

Regards,

Diego
