Use the PowerShell or Bash script in this section to export a large number of cells from an aggregate storage (ASO) cube.
Because of the limits imposed by the QUERYRESULTLIMIT Oracle Essbase configuration setting, it is not possible to export a large amount of data through the user interface. The PowerShell and Bash scripts available in this section split the export into a defined number of jobs, run each job, download the exported data, and concatenate the export files into a single export file, ensuring that it contains only one header (the merge step is sketched after the note below).
Note:
These scripts run an existing job of type Export Data. For detailed instructions on creating jobs, see Managing Jobs in Administering Planning.
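For illustration only, the merge step works like the following minimal sketch; the file names export-part-*.csv and export-merged.csv are hypothetical, and the actual scripts implement this logic when they write the combined output file:
#keep the header line of the first partial export, then append only the data rows of every file
head -n 1 export-part-1.csv > export-merged.csv
for f in export-part-*.csv
do
tail -n +2 "${f}" >> export-merged.csv
done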
PowerShell Script
$user = '<USERNAME>'
$pass = '<PASSWORD>'
$serverURL = '<URL>'
$applicationName = '<APPLICATIONNAME>'
$cubeName = '<CUBENAME>'
$splitDimension = '<DIMENSION_TO_SPLIT_THE_EXPORT>'
$topLevelMemberForExport = '<TOP_MEMBER_FOR_EXPORT>'
$exportJobName = '<EXPORT_JOB_NAME>'
$exportFilePrefix = '<PREFIX_FOR_EXPORT_FILE>'
$columnMembers = '<MEMBERS_ON_COLUMNS>'
$povMembers = '<POV_MEMBERS>'
$numberOfExportFiles = <NUMBER_OF_FILES_TO_SPLIT_THE_EXPORT>
$memberArray = @()
$exportFileArray = @()

#collect the names of all level 0 members (members without children) below $parent
function getLevel0 ($parent) {
    $parent.children.ForEach({
        if ( $_.children.count -eq 0 ) { $script:memberArray += $_.name }
        getLevel0($_)
    })
}

#return the subtree of $tree whose root member is named $memberName
function findMember ($tree, $memberName) {
    $subtree = ""
    if ($tree.name -eq $memberName){
        return $tree
    } else {
        $tree.children.ForEach({
            #Write-Host $_.name
            if ($subtree -eq ""){ $subtree = findMember $_ $memberName }
        })
        return $subtree
    }
}

#putting together base64 encoded authentication header based on user and password
$encodedCredentials = [Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($($user) + ":" + $($pass)))
$headers = @{ Authorization = "Basic $encodedCredentials" }

#test login
$testRequest = $serverURL + '/HyperionPlanning/rest/v3/applications'
try { $response = Invoke-RestMethod -Uri $testRequest -Method Get -Headers $headers -UseBasicParsing }
catch { Write-Host $_; return }

#retrieve dimension hierarchy from application
Write-Host "Retrieving member list for split dimension " $splitDimension
$request = $serverURL + '/HyperionPlanning/rest/v3/internal/applications/' + $applicationName + '/plantypes/' + $cubeName + '/dimensions/' + $splitDimension
try { $response = Invoke-RestMethod -Uri $request -Method Get -Headers $headers -UseBasicParsing }
catch { Write-Host $_; return }
Write-Host $splitDimension " member list retrieved"

#search for the top of the export hierarchy
Write-Host "Searching for member " $topLevelMemberForExport " in hierarchy"
$member = findMember $response $topLevelMemberForExport
if ( $member.name -ne $topLevelMemberForExport ) {
    Write-Host $topLevelMemberForExport " not found in hierarchy, exiting ..."
    return 128
}
Write-Host "Found member " $topLevelMemberForExport " in hierarchy"

#retrieve level 0 members in export hierarchy
Write-Host "Retrieving Level 0 members for hierarchy"
getLevel0($member)
if ( $memberArray.Length -eq 0 ) {
    Write-Host "no level 0 members found in hierarchy, exiting ..."
    return 128
}
Write-Host $memberArray.Length " Level 0 members for export hierarchy retrieved"

$request = $serverURL + '/HyperionPlanning/rest/v3/applications/' + $applicationName + '/jobs'

#splitting member list into the number of export files
$numberOfEntitiesPerFile = [math]::truncate($memberArray.Length / $numberOfExportFiles)
for ($i = 1; $i -le $numberOfExportFiles; $i++) {
    $memberList = ""
    $firstMember = ($i - 1) * $numberOfEntitiesPerFile
    if ($i -lt $numberOfExportFiles) {
        $lastMember = $i * $numberOfEntitiesPerFile
    } else {
        $lastMember = $i * $numberOfEntitiesPerFile + $memberArray.Length % $numberOfExportFiles
    }
    for ($j = $firstMember; $j -lt $lastMember; $j++) {
        $memberList += $memberArray[$j]
        if ($j -lt $lastMember - 1) { $memberList += "," } #avoid adding a comma (,) after the last member of each set
    }
    $jobDetails = ' { "jobType":"EXPORT_DATA","jobName":"' + $exportJobName + '", "parameters":{ "exportFileName":"Export-' + $i + '.zip", "rowMembers":"' + $memberList + '", "columnMembers":"' + $columnMembers + '", "povMembers":"' + $povMembers + '" } }'

    #start export job
    try { $response = Invoke-RestMethod -Uri $request -Method Post -Headers $headers -Body $jobDetails -ContentType "application/json" }
    catch { Write-Host $_; return }
    Write-Host "Started export job " $i " out of " $numberOfExportFiles

    #checking job status, continue once job is completed
    $statusRequest = $serverURL + '/HyperionPlanning/rest/v3/applications/' + $applicationName + '/jobs/' + $response.jobId
    $statusResponse = Invoke-RestMethod -Uri $statusRequest -Method Get -Headers $headers -UseBasicParsing
    while ( $statusResponse.descriptiveStatus -eq "Processing" ) {
        Write-Host $statusResponse.descriptiveStatus
        Start-Sleep -s 10
        $statusResponse = Invoke-RestMethod -Uri $statusRequest -Method Get -Headers $headers -UseBasicParsing
    }
    Write-Host $statusResponse.descriptiveStatus

    Write-Host "Downloading export file ..."
    $downloadRequest = $serverURL + '/interop/rest/11.1.2.3.600/applicationsnapshots/Export-' + $i + '.zip/contents'
    $statusResponse = Invoke-RestMethod -Uri $downloadRequest -Method Get -Headers $headers -OutFile "$exportFilePrefix-$i.zip"

    Write-Host "Expanding archive ..."
    Expand-Archive -Force -LiteralPath "$exportFilePrefix-$i.zip" -DestinationPath "$exportFilePrefix-$i"
    Remove-Item "$exportFilePrefix-$i.zip"
    Get-ChildItem -Path "$exportFilePrefix-$i" -File -Name | ForEach-Object { $exportFileArray += "$exportFilePrefix-$i\" + $_ }
}

Write-Host "creating outputfile ..."
#write header to outputfile
Get-Content $exportFileArray[0] | Select-Object -First 1 | Out-File "$exportFilePrefix.csv"
#write content to outputfile skipping header
ForEach ($exportFile in $exportFileArray) {
    Get-Content $exportFile | Select-Object -Skip 1 | Out-File -Append "$exportFilePrefix.csv"
}
Compress-Archive -LiteralPath "$exportFilePrefix.csv" -DestinationPath "$exportFilePrefix.zip"

Write-Host "cleaning up ..."
Remove-Item "$exportFilePrefix-*" -Recurse
Remove-Item "$exportFilePrefix.csv"
Bash Script
#!/bin/bash
user='<USERNAME>'
pass='<PASSWORD>'
serverURL='<URL>'
applicationName='<APPLICATIONNAME>'
cubeName='<CUBENAME>'
splitDimension='<DIMENSION_TO_SPLIT_THE_EXPORT>'
topLevelMemberForExport='<TOP_MEMBER_FOR_EXPORT>'
exportJobName='<EXPORT_JOB_NAME>'
exportFilePrefix='<PREFIX_FOR_EXPORT_FILE>'
columnMembers='<MEMBERS_ON_COLUMNS>'
povMembers='<POV_MEMBERS>'
numberOfExportFiles=<NUMBER_OF_FILES_TO_SPLIT_THE_EXPORT>
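#getRowMembers: from the JSON member objects passed in $1, build a comma-separated list of the
#member names with index between firstMember ($2, inclusive) and lastMember ($3, exclusive)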
getRowMembers() {
local memberList="$1"
local firstMember=$2
local lastMember=$3
local nameCount=0
local rowMember=""
local rowMembers=""
while IFS= read -r line
do
if [[ "${line}" == *"name"* ]]
then
if [[ ${nameCount} -ge ${firstMember} ]] && [[ ${nameCount} -lt ${lastMember} ]]
then
rowMember=$(echo "${line}" | cut -d':' -f2- | sed s'/[",]//g')
rowMembers="${rowMembers}${rowMember},"
fi
((nameCount+=1))
fi
done <<< "${memberList}"
rowMembers=$(echo "${rowMembers}" | rev | cut -d',' -f2- | rev)
echo "${rowMembers}"
}
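#getLevel0: walk the member hierarchy JSON passed in $1 and echo the JSON objects of all
#level 0 members, that is, members that have no children attribute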
getLevel0()
{
local memberList="$1"
local names=$(echo "${memberList}" | jq 'recurse (try .children[]) | .name' | sed -e 's/"//g')
local elements=""
formerIFS=$IFS
IFS=$'\n'
namesarr=($names)
IFS=$formerIFS
for i in ${!namesarr[@]}
do
testelement=$(echo "${memberList}" | jq --arg currentName "${namesarr[i]}" 'recurse (try .children[]) | select(.name==$currentName)')
if [[ "${testelement}" != *"children"* ]]
then
elements="${elements}${testelement}"
fi
done
echo "${elements}"
}
#test login
header="Content-Type: application/x-www-form-urlencoded"
applicationsRequest="${serverURL}/HyperionPlanning/rest/v3/applications"
response=$(curl -X "GET" -s -w "%{http_code}" -u "${user}:${pass}" -H "${header}" "${applicationsRequest}")
http_response_code=$(echo "${response}" | rev | cut -d'}' -f1 | rev)
if [ ${http_response_code} -ne 200 ]
then
echo "${response}"
exit
fi
#retrieve dimension hierarchy from application
echo "Retrieving member list for split dimension ${splitDimension}"
splitDimensionRequest="${serverURL}/HyperionPlanning/rest/v3/internal/applications/${applicationName}/plantypes/${cubeName}/dimensions/${splitDimension}"
response=$(curl -X GET -s -w "%{http_code}" -u "${user}:${pass}" -o "response-memberlist.txt" -D "respHeader-memberlist.txt" -H "${header}" "${splitDimensionRequest}")
http_response_code=$(echo "${response}" | rev | cut -d'}' -f1 | rev)
if [ ${http_response_code} -ne 200 ]
then
echo "${response}"
exit
fi
echo "${splitDimension} member list retrieved"
#search for the top of the export hierarchy
echo "Searching for member ${topLevelMemberForExport} in hierarchy"
memberList=$(cat response-memberlist.txt | jq --arg topLevelMember "${topLevelMemberForExport}" 'recurse(try .children[]) | select (.name == $topLevelMember)')
if [[ "${memberList}" == "" ]]
then
echo "${topLevelMemberForExport} not found in hierarchy, exiting ..."
exit 128
fi
echo "Found member ${topLevelMemberForExport} in hierarchy"
#retrieve level 0 members in export hierarchy
echo "Retrieving Level 0 members for hierarchy"
totalCount=$(echo "${memberList}" | grep "name" | wc -l)
grepChildrenCount=$(echo "${memberList}" | grep "children" | wc -l)
levelZeroCount=$((totalCount-grepChildrenCount))
if [[ "${levelZeroCount}" -eq 0 ]]
then
echo "no level 0 members found in hierarchy, exiting ..."
exit 128
fi
echo "${levelZeroCount} Level 0 members for export hierarchy retrieved"
#splitting member list into the number of export files
numberOfEntitiesPerFile=$((levelZeroCount/numberOfExportFiles))
jobsRequest="${serverURL}/HyperionPlanning/rest/v3/applications/${applicationName}/jobs"
header="Content-Type: application/json"
for ((i = 1 ; i <= ${numberOfExportFiles}; i++))
do
firstMember=$((($i-1)*numberOfEntitiesPerFile))
if [[ ${i} -lt ${numberOfExportFiles} ]]
then
lastMember=$((i*numberOfEntitiesPerFile))
else
lastMember=$((i*numberOfEntitiesPerFile+levelZeroCount%numberOfExportFiles))
fi
elements=$(getLevel0 "${memberList}")
rowMembers=$(getRowMembers "${elements}" ${firstMember} ${lastMember})
response=$(curl -X POST -s -w "%{http_code}" -u "${user}:${pass}" -o "response-job.txt" -D "respHeader-job.txt" -H "${header}" "${jobsRequest}" -d '{"jobType":"EXPORT_DATA","jobName":"'"${exportJobName}"'","parameters":{"exportFileName":"Export-'"${i}"'.zip","rowMembers":"'"${rowMembers}"'","columnMembers":"'"${columnMembers}"'","povMembers":"'"${povMembers}"'"}}')
echo "Started export job " $i " out of " $numberOfExportFiles
jobId=$(cat response-job.txt | grep -o '"jobId":[^, }]*' | cut -d':' -f2)
descriptiveStatus=$(cat response-job.txt | grep -o '"descriptiveStatus":[^, }]*' | cut -d':' -f2 | sed -e 's/"//g')
jobIdRequest="${serverURL}/HyperionPlanning/rest/v3/applications/${applicationName}/jobs/${jobId}"
response=$(curl -X GET -s -w "%{http_code}" -u "${user}:${pass}" -o "response-jobstatus.txt" -D "respHeader-jobstatus.txt" -H "${header}" "${jobIdRequest}")
jobId=$(cat response-jobstatus.txt | grep -o '"jobId":[^, }]*' | cut -d':' -f2)
descriptiveStatus=$(cat response-jobstatus.txt | grep -o '"descriptiveStatus":[^, }]*' | cut -d':' -f2 | sed -e 's/"//g')
while [[ "${descriptiveStatus}" == "Processing" ]]
do
echo "${descriptiveStatus}"
sleep 10
response=$(curl -X GET -s -w "%{http_code}" -u "${user}:${pass}" -o "response-jobstatus.txt" -D "respHeader-jobstatus.txt" -H "${header}" "${jobIdRequest}")
descriptiveStatus=$(cat response-jobstatus.txt | grep -o '"descriptiveStatus":[^, }]*' | cut -d':' -f2 | sed -e 's/"//g')
done
echo "${descriptiveStatus}"
echo "Downloading export file ..."
contentsRequest="${serverURL}/interop/rest/11.1.2.3.600/applicationsnapshots/Export-${i}.zip/contents"
#omit -w so the HTTP status code is not appended to the downloaded archive
curl -X GET -s -u "${user}:${pass}" -D "respHeader-download.txt" "${contentsRequest}" > "${exportFilePrefix}-${i}.zip"
echo "Expanding archive ..."
unzip "${exportFilePrefix}-${i}.zip" -d "${exportFilePrefix}-${i}"
rm "${exportFilePrefix}-${i}.zip"
echo "Writing to outputfile ..."
if [[ -d "${exportFilePrefix}-${i}" ]]
then
find "${exportFilePrefix}-${i}" -name \*.csv | xargs cat | tail -n +2 >> "${exportFilePrefix}.csv"
fi
done
zip "${exportFilePrefix}.zip" "${exportFilePrefix}.csv"
echo "cleaning up ..."
find . -name "${exportFilePrefix}-*" | xargs rm -r
rm "${exportFilePrefix}.csv"
To export a large number of cells from an aggregate storage (ASO) cube:
Copy the appropriate script and save it as ASOCellExport.ps1 or ASOCellExport.sh, update the variable values listed in Table 3-16 for your environment, and then run the script (see the example after the table).
Table 3-16 Variable Values to Include in the PowerShell and Bash Scripts
| Variable | Description |
|---|---|
| user | Domain and user name of a Service Administrator, in DOMAIN.USER format. |
| pass | Password of the Service Administrator, or the location of the encrypted password file. For information on creating an encrypted password file, see the encrypt command. |
| serverURL | URL of the Oracle Fusion Cloud Enterprise Performance Management environment. |
| applicationName | Name of a Planning, Financial Consolidation and Close, Tax Reporting, or Enterprise Profitability and Cost Management application. |
| cubeName | Name of a cube in the application. |
| splitDimension | Name of the dimension whose members are used to split the export into groups. |
| topLevelMemberForExport | Name of a member of the dimension subhierarchy under which the list of level 0 members is generated. |
| exportJobName | Name of an existing job of type Export Data. The settings specified in this job are overridden by the parameters that you set in the script. |
| exportFilePrefix | A file name prefix that uniquely identifies the files generated by the export job. |
| columnMembers | The column members to include in the export. |
| povMembers | The POV members to include in the export. The POV must include all remaining dimensions and can contain functions. |
| numberOfExportFiles | Number of jobs to run for this export. Increase this number if query limit restrictions still cause the export to fail. |
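For example, assuming the script is saved in the current working directory under the file name given above (shell, path, and invocation are assumptions; adjust them to your environment):
#Linux/UNIX: run the Bash script
chmod +x ./ASOCellExport.sh
./ASOCellExport.sh
#Windows: run the PowerShell script from a PowerShell prompt
powershell.exe -ExecutionPolicy Bypass -File .\ASOCellExport.ps1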