Hi Gregor,
Sorry, I missed that it was only failing on update. Do you get any errors?
If it works for the create, I assume you have a valid CSRF token etc.
Regards,
Jason
Hello All,
Right now I am using a Dell Inspiron 14" laptop with 4 GB RAM, an i5 processor, and a 250 GB drive, and I am not facing any issues yet (with a good anti-virus).
Do I need to have 8 GB RAM for now?
Will the configuration below serve my purpose or not?
Dell Inspiron 15-3542 Notebook
Operating System:Windows® 8.1
Processor:Intel Core i3-4030U
Memory:4GB
Display:15.6” Screen
Storage Drive:1TB Hard Drive
Optical Drive:DVDRW
Hi Shruti
Please close the thread.
Regards
Surah
Hi Kapil,
8 GB of RAM will speed up your system and is good when you run multiple tcode screens at the same time.
The configuration below is good; just change the RAM as shown.
Dell Inspiron 15-3542 Notebook
Operating System:Windows® 8.1
Processor:Intel Core i3-4030U
Memory:4GB -----------------> 8 GB
Display:15.6” Screen
Storage Drive:1TB Hard Drive
Optical Drive:DVDRW
Regards
Suraj
Srini,
This is not a cross tab but a very simple vertical table with 3 columns: STATE, CITY and SALES.
I simply have a State and a City dimension, and sales are reported city-wise. State to city is usually one-to-many, so if I pull State, City and the sales, it gives me sales for multiple cities per state.
Now the requirement is to display only one city per state, namely the one with the highest city sales in that state. I achieved this by tweaking the SALES metric column with the following formula:
[XYZ] =Max([SALES]) ForAll ([CITY])
I also tweaked the CITY dimension as follows:
=[CITY] Where ([SALES] = [XYZ])
Basically, what I did was identify, out of all the cities of a state, what the MAX sales is, and restrict the table to show only one city per state, the one with the MAX sales, as required.
But now there is a small adjustment that needs to be done as per the user: if 2 or 3 cities have the same MAX sales, then I need to show all of them. With this approach, however, I get a #MULTIVALUE error wherever there is more than one city with the MAX sales.
I appreciate your help and response on this, Srini. But how can I resolve this?
Thanks,
RCC
Thanks Prakhar,
I am following the link below to configure Solman (to create a customized Service Request).
I created a personalized TCODE ZMRQ by copying SMRQ...
When I create a Service Request, it is still created with SMRQ. Please advise.
Hi All,
I have created an RTTS output but see that the icons for save to Excel, sorting, filtering, etc. are not being displayed. I used the RTTS example "Create Dynamic Table using RTTS and display in ALV". In this example there is no field catalog. Is there a way to get the icons using ALV? The example uses class cl_salv_table.
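For reference, here is a minimal sketch of how I understand the toolbar functions can be switched on with cl_salv_table. The data reference lr_data and the field symbol <lt_data> are just placeholders for the table built via RTTS in the example, and I have not verified whether set_all behaves the same way in full-screen mode as in a container:

" lr_data is assumed to be the data reference created in the RTTS step,
" e.g. CREATE DATA lr_data TYPE HANDLE lo_table_descr.
FIELD-SYMBOLS <lt_data> TYPE STANDARD TABLE.
ASSIGN lr_data->* TO <lt_data>.

DATA lo_alv TYPE REF TO cl_salv_table.

TRY.
    " Build the ALV instance for the dynamically typed table
    cl_salv_table=>factory(
      IMPORTING r_salv_table = lo_alv
      CHANGING  t_table      = <lt_data> ).

    " Switch on the standard toolbar functions (export, sort, filter, ...)
    lo_alv->get_functions( )->set_all( abap_true ).

    lo_alv->display( ).
  CATCH cx_salv_msg cx_salv_wrong_call.
    " React to SALV errors as appropriate for the report
ENDTRY.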
Thanks,
Please let me know the difference between these 2 certifications and which one is more beneficial to gain. As per the course content, both seem to follow HA200.
Please suggest.
Looking to set up SSO from BOE to HANA using SAML and coming up short on what is hopefully just some missing configuration. If anyone has experience getting this running, I'd be grateful for feedback or links to more comprehensive documentation.
We are running BOE 4.1 SP5 and HANA rev 92 (on a multiple node installation). The plan is to 1) enable SSL logins on HANA, 2) set up BOE as the IdP, 3) create the SAML provider in HANA and establish trust between the two systems.
Everything has been restarted after the last configuration change.
A test user has been set up in HANA with the SAML provider enabled, user name matching a BOE enterprise account. When testing from the CMC, we see the following error message: Connection Failed: The test of the HANA SSO ticket used to log onto the HANA DB has failed due to: [10]: invalid username or password. (FWM 02133)
The HANA tracelog, set to debug, shows some errors in SAMLAuthenticator (ERROR in libxmlsec) before it culminates in this block:
[22277]{-1}[-1/-1] 2015-02-02 20:10:23.882796 i Authentication SAMLAuthenticator.cpp(00400) : Unable to verify XML signature
[22277]{-1}[-1/-1] 2015-02-02 20:10:23.882934 d Authentication ManagerAcceptor.cpp(00273) : Injecting logon name into method:
[22277]{-1}[-1/-1] 2015-02-02 20:10:23.882986 d Authentication SAPLogonManager.cpp(00360) : Store chosen for assertion ticket validation: saplogon.pse
[22277]{-1}[-1/-1] 2015-02-02 20:10:23.883114 w Authentication SAPLogonManager.cpp(00504) : The base64 decode of the received ticket failed. SSO_RC return value: 1281
[22277]{-1}[-1/-1] 2015-02-02 20:10:23.883121 d Authentication SAPLogonManager.cpp(00513) : Use SSO Validation PSE >>>saplogon.pse<<<
[22277]{-1}[-1/-1] 2015-02-02 20:10:23.883123 d Authentication SAPLogonManager.cpp(00514) : Received Base64 Ticket >>>SAML 2.0 assertion ticket...<<<
[22277]{-1}[-1/-1] 2015-02-02 20:10:23.883167 i Authentication MethodSAPLogon.cpp(00275) : unsuccessful login attempt with SAPLogon/SAPAssertion ticket!
[22277]{-1}[-1/-1] 2015-02-02 20:10:23.883181 d Authentication ManagerAcceptor.cpp(00273) : Injecting logon name into method:
[22277]{-1}[63/-1] 2015-02-02 20:10:23.884313 d Authentication Connection.cc(03617) : [PRE AUTHENTICATION] logon name:
[22277]{-1}[63/-1] 2015-02-02 20:10:23.884359 d Authentication Connection.cc(03684) : [POST AUTHENTICATION] logon name:
It looks like the ticket is received but not being parsed. It's not clear to me if this is related to the certificate or some other configuration element, or exactly what the missing piece is.
David, the fields that are looked up in the form are normally not part of the form's interface, so how would you pass data to the form? Enhance the interface structure as well? That's exactly where adding code in the form starts to look more appealing by the second.
Hello Frank,
You have debugged quite a while, I guess. Nevertheless, it is not possible to overcome the limitation and still stay within the API. Please respect the package interfaces and stick to the published API classes. Ever heard of ancient Greek jewellery (p.......s b.x)? Using side effects of implementation details seldom leads to a happy ending.
Regards,
Klaus
Then the request is marked for deletion, but you will still see the data since the partition is not dropped yet. When all the requests in the partition are marked for deletion, the partition will be dropped and the data will disappear; until then you will see the data in the changelog table.
Hi,
Thanks for the information.
I am worried about a few more steps, such as:
Do we need to request the BI migration after the DB migration, or is that not needed?
How about load balancing?
Should the BODS migration happen after BOE, or can we perform it before too?
SharePoint?
Does anyone have a document or note regarding all these kinds of factors?
Can you verify whether the Time Management integration is ON?
If you do not need it, turn it off and that should resolve the issue.
Good Evening Frank,
In general, constraints do not depend on cl_abap_unit_assert=>assert_that( ). Therefore it should be possible to code any unit test for your custom constraint. As you mentioned, there is an important limitation: using the methods of cl_abap_unit_assert within the implementation of the constraint results in a weird test protocol. There is no way to overcome this limitation.
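To illustrate, here is a small sketch of how such a custom constraint and a test using it could look. The class names lcl_not_initial and ltc_constraint_demo are only invented for the example; if_constraint and cl_abap_unit_assert=>assert_that are the standard pieces, and the constraint itself contains no assert calls:

" Hypothetical local constraint: passes when the actual value is not initial.
CLASS lcl_not_initial DEFINITION FINAL.
  PUBLIC SECTION.
    INTERFACES if_constraint.
ENDCLASS.

CLASS lcl_not_initial IMPLEMENTATION.
  METHOD if_constraint~is_valid.
    " The constraint only computes a verdict; no cl_abap_unit_assert here
    result = boolc( data_object IS NOT INITIAL ).
  ENDMETHOD.

  METHOD if_constraint~get_description.
    APPEND `Actual value is expected to be not initial` TO result.
  ENDMETHOD.
ENDCLASS.

CLASS ltc_constraint_demo DEFINITION FINAL FOR TESTING
  DURATION SHORT RISK LEVEL HARMLESS.
  PRIVATE SECTION.
    METHODS not_initial_value FOR TESTING.
ENDCLASS.

CLASS ltc_constraint_demo IMPLEMENTATION.
  METHOD not_initial_value.
    " The assertion itself stays outside the constraint implementation
    cl_abap_unit_assert=>assert_that(
      act = 'some value'
      exp = NEW lcl_not_initial( ) ).
  ENDMETHOD.
ENDCLASS.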
Hope this helps/clarifies
Klaus
Hi All,
I have a weird situation here. At around 9:30 PM every night, I get emails telling me that our concurrent user limit is being exceeded (we have a watch set up to monitor concurrent sessions) and that the Webi Processing Server is out of memory or has reached its connection limit. Keep in mind, there are no users logged in at that point in time; all our users are done with the environment by 6 PM. How is that even possible when there are no users in the system and nobody is running reports? Could someone please help me understand this behavior? Thank you.
Hi,
I think that if the features of IQ are not enabled, you need to execute sp_iqlmconfig as below.
- sp_iqlmconfig 'allow' , 'all'
[Related KBA]
https://i7p.wdf.sap.corp/sap/support/notes/2040456
And you need to check the number of cores using the cpuinfo utility.
~/SYSAM-2_0/bin/cpuinfo -v
If it's insufficient, you need to regenerate the license file.
Lastly, the LT/PE of xxx.lmp and of the license file have to be identical.
Ex) LT=CP/PE=EE
If they are not the same, you need to edit the LT/PE in the xxx.lmp file.
HTH
Gi-Sung Jang
Raj
Please find my responses below.
1. I have created the configuration and loaded the KNA1 table from the ECC system to the HANA system. It loaded successfully. I checked SM37 on the SLT server for the job details, but I am not able to find any job created for this run; in parallel I checked in HANA Studio --> Data Provisioning --> Jobs, and there too I am not finding any job created for the loading of KNA1 from ECC to HANA.
So now I want to know where I can see the jobs created in the background to load the KNA1 data into HANA, because I want to measure the time taken to load it. Please check and help me to get the job details: do I need to check SM37 in the ECC system? In the meantime, please tell me where I can get complete statistics on the TIME taken to load the KNA1 data into HANA from the ECC system.
Response:
You will see something like this:
DD02L /1CADMC/00000501 Replication TRANSP X Activated Table created Table created Synonym created
If the replication status shows an initial load, then the data is still loading from the source to the target system. You can check the status of the load from the Load Statistics tab. In the selection, select "Load in process" and check whether you see the records. If you refresh, you can see the records being calculated and read.
If the above steps do not work, then you can check the jobs in SM37. You will see jobs like this:
/1LT/IUC_LOAD_MT_016_010 SAPBASIS Active 02/03/2015 00:00:14 63,019
/1LT/IUC_REP_MSTR SAPBASIS Active 02/03/2015 00:00:05 63,028
IUC_LOAD_MT jobs are the data transfer jobs used to transfer the data between the systems. IUC_REP_MSTR is the master job which controls all the SLT jobs in the system. If you don't see any jobs in SM37, then the master job must have stopped.
Check and let me know these things
a) Do I need to make any additional settings to replicate the BSEG and BKPF tables, given that these are cluster tables and contain a huge amount of data?
Response
If you would like to replicate normally, single-threaded, then you don't need additional settings. But if you would like to use the parallelization from Tobias' blog, then you need to make some changes to the tables before you start the replication.
b) I have seen a blog from Tobias Koebler on "How To filter on the initial load & parallelize replication DMIS 2011 SP06 or higher". Please check and suggest whether I need to follow these steps for replicating the BSEG and BKPF tables. Please clarify.
Response
Tobias has written a very good blog on improving the initial load for large tables. You can use it for both cluster and transparent tables. I have implemented this in our landscape and it is working well. Check this link.
If you need any help with the parallel replication, let me know.
Mahesh Shetty