Use the PowerShell or Bash script in this section to export a large number of cells from an aggregate storage (ASO) cube.
The limits imposed by the Oracle Essbase QUERYRESULTLIMIT setting make it impossible to export large amounts of data from the user interface. The PowerShell script in this section splits the export operation into a specified number of jobs, runs each job, downloads the exported data, and then concatenates the export files into a single file so that only one header row is kept.
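The single-header concatenation performed at the end of both scripts boils down to the following idea (a minimal sketch with hypothetical file names, not part of the scripts below): keep the header line from the first downloaded export only and append the data rows from every export.

head -n 1 export-1.csv > combined.csv
for f in export-*.csv; do
    tail -n +2 "$f" >> combined.csv
done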
Note:
These scripts run an existing job of type Export Data. For detailed instructions on creating jobs, see "Managing Jobs" in Administering Planning.
PowerShell Script
$user = '<USERNAME>'
$pass = '<PASSWORD>'
$serverURL = '<URL>'
$applicationName = '<APPLICATIONNAME>'
$cubeName = '<CUBENAME>'
$splitDimension = '<DIMENSION_TO_SPLIT_THE_EXPORT>'
$topLevelMemberForExport = '<TOP_MEMBER_FOR_EXPORT>'
$exportJobName = '<EXPORT_JOB_NAME>'
$exportFilePrefix = '<PREFIX_FOR_EXPORT_FILE>'
$columnMembers = '<MEMBERS_ON_COLUMNS>'
$povMembers = '<POV_MEMBERS>'
$numberOfExportFiles = <NUMBER_OF_FILES_TO_SPLIT_THE_EXPORT>
$memberArray = @()
$exportFileArray = @()

#getLevel0: collect the names of all members below $parent that have no children
function getLevel0 ($parent) {
    $parent.children.ForEach({
        if ( $_.children.count -eq 0 ) { $script:memberArray += $_.name }
        getLevel0($_)
    })
}

#findMember: return the subtree of $tree whose root member is named $memberName
function findMember ($tree, $memberName) {
    $subtree = ""
    if ($tree.name -eq $memberName){
        return $tree
    } else {
        $tree.children.ForEach({
            #Write-Host $_.name
            if ($subtree -eq ""){ $subtree = findMember $_ $memberName }
        })
        return $subtree
    }
}

#putting together base64 encoded authentication header based on user and password
$encodedCredentials = [Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($($user) + ":" + $($pass)))
$headers = @{ Authorization = "Basic $encodedCredentials" }

#test login
$testRequest = $serverURL + '/HyperionPlanning/rest/v3/applications'
try {
    $response = Invoke-RestMethod -Uri $testRequest -Method Get -Headers $headers -UseBasicParsing
} catch {
    Write-Host $_
    return
}

#retrieve dimension hierarchy from application
Write-Host "Retrieving member list for split dimension " $splitDimension
$request = $serverURL + '/HyperionPlanning/rest/v3/internal/applications/' + $applicationName + '/plantypes/' + $cubeName + '/dimensions/' + $splitDimension
try {
    $response = Invoke-RestMethod -Uri $request -Method Get -Headers $headers -UseBasicParsing
} catch {
    Write-Host $_
    return
}
Write-Host $splitDimension " member list retrieved"

#search for the top of the export hierarchy
Write-Host "Searching for member " $topLevelMemberForExport " in hierarchy"
$member = findMember $response $topLevelMemberForExport
if ( $member.name -ne $topLevelMemberForExport ) {
    Write-Host $topLevelMemberForExport " not found in hierarchy, exiting ..."
    return 128
}
Write-Host "Found member " $topLevelMemberForExport " in hierarchy"

#retrieve level 0 members in export hierarchy
Write-Host "Retrieving Level 0 members for hierarchy"
getLevel0($member)
if ( $memberArray.Length -eq 0 ) {
    Write-Host "no level 0 members found in hierarchy, exiting ..."
    return 128
}
Write-Host $memberArray.Length " Level 0 members for export hierarchy retrieved"

$request = $serverURL + '/HyperionPlanning/rest/v3/applications/' + $applicationName + '/jobs'

#splitting member list into the number of export files
$numberOfEntitiesPerFile = [math]::truncate($memberArray.Length / $numberOfExportFiles)
for ($i = 1; $i -le $numberOfExportFiles; $i++) {
    $memberList = ""
    $firstMember = ($i - 1) * $numberOfEntitiesPerFile
    if ($i -lt $numberOfExportFiles) {
        $lastMember = $i * $numberOfEntitiesPerFile
    } else {
        $lastMember = $i * $numberOfEntitiesPerFile + $memberArray.Length % $numberOfExportFiles
    }
    for ($j = $firstMember; $j -lt $lastMember; $j++) {
        $memberList += $memberArray[$j]
        if ($j -lt $lastMember - 1) { $memberList += "," } #avoid adding a comma (,) after the last member of each set
    }
    $jobDetails = ' { "jobType":"EXPORT_DATA","jobName":"' + $exportJobName + '", "parameters":{ "exportFileName":"Export-' + $i + '.zip", "rowMembers":"' + $memberList + '", "columnMembers":"' + $columnMembers + '", "povMembers":"' + $povMembers + '" } }'
    #start export job
    try {
        $response = Invoke-RestMethod -Uri $request -Method Post -Headers $headers -Body $jobDetails -ContentType "application/json"
    } catch {
        Write-Host $_
        return
    }
    Write-Host "Started export job " $i " out of " $numberOfExportFiles
    #checking job status, continue once job is completed
    $statusRequest = $serverURL + '/HyperionPlanning/rest/v3/applications/' + $applicationName + '/jobs/' + $response.jobId
    $statusResponse = Invoke-RestMethod -Uri $statusRequest -Method Get -Headers $headers -UseBasicParsing
    while ( $statusResponse.descriptiveStatus -eq "Processing" ) {
        Write-Host $statusResponse.descriptiveStatus
        Start-Sleep -s 10
        $statusResponse = Invoke-RestMethod -Uri $statusRequest -Method Get -Headers $headers -UseBasicParsing
    }
    Write-Host $statusResponse.descriptiveStatus
    Write-Host "Downloading export file ..."
    $downloadRequest = $serverURL + '/interop/rest/11.1.2.3.600/applicationsnapshots/Export-' + $i + '.zip/contents'
    $statusResponse = Invoke-RestMethod -Uri $downloadRequest -Method Get -Headers $headers -OutFile "$exportFilePrefix-$i.zip"
    Write-Host "Expanding archive ..."
    Expand-Archive -Force -LiteralPath "$exportFilePrefix-$i.zip" -DestinationPath "$exportFilePrefix-$i"
    Remove-Item "$exportFilePrefix-$i.zip"
    Get-ChildItem -Path "$exportFilePrefix-$i" -File -Name | ForEach-Object { $exportFileArray += "$exportFilePrefix-$i\" + $_ }
}

Write-Host "creating outputfile ..."
#write header to output file
Get-Content $exportFileArray[0] | Select-Object -First 1 | Out-File "$exportFilePrefix.csv"
#write content to output file, skipping the header of each export file
ForEach ($exportFile in $exportFileArray) {
    Get-Content $exportFile | Select-Object -Skip 1 | Out-File -Append "$exportFilePrefix.csv"
}
Compress-Archive -LiteralPath "$exportFilePrefix.csv" -DestinationPath "$exportFilePrefix.zip"
Write-Host "cleaning up ..."
Remove-Item "$exportFilePrefix-*" -Recurse
Remove-Item "$exportFilePrefix.csv"
Bash Script
#!/bin/bash
user='<USERNAME>'
pass='<PASSWORD>'
serverURL='<URL>'
applicationName='<APPLICATIONNAME>'
cubeName='<CUBENAME>'
splitDimension='<DIMENSION_TO_SPLIT_THE_EXPORT>'
topLevelMemberForExport='<TOP_MEMBER_FOR_EXPORT>'
exportJobName='<EXPORT_JOB_NAME>'
exportFilePrefix='<PREFIX_FOR_EXPORT_FILE>'
columnMembers='<MEMBERS_ON_COLUMNS>'
povMembers='<POV_MEMBERS>'
numberOfExportFiles=<NUMBER_OF_FILES_TO_SPLIT_THE_EXPORT>
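# getRowMembers: given the JSON objects of the level 0 members, return a
# comma-separated list of the member names whose index falls in the range
# [firstMember, lastMember).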
getRowMembers() {
local memberList="$1"
local firstMember=$2
local lastMember=$3
local nameCount=0
local rowMember=""
local rowMembers=""
while IFS= read -r line
do
if [[ "${line}" == *"name"* ]]
then
if [[ ${nameCount} -ge ${firstMember} ]] && [[ ${nameCount} -lt ${lastMember} ]]
then
rowMember=$(echo "${line}" | cut -d':' -f2- | sed s'/[",]//g')
rowMembers="${rowMembers}${rowMember},"
fi
((nameCount+=1))
fi
done <<< "${memberList}"
rowMembers=$(echo "${rowMembers}" | rev | cut -d',' -f2- | rev)
echo "${rowMembers}"
}
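# getLevel0: walk the member hierarchy JSON with jq and return only the
# member objects that have no "children" attribute, that is, the level 0 members.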
getLevel0()
{
local memberList="$1"
local names=$(echo "${memberList}" | jq 'recurse (try .children[]) | .name' | sed -e 's/"//g')
local elements=""
formerIFS=$IFS
IFS=$'\n'
namesarr=($names)
IFS=$formerIFS
for i in ${!namesarr[@]}
do
testelement=$(echo "${memberList}" | jq --arg currentName "${namesarr[i]}" 'recurse (try .children[]) | select(.name==$currentName)')
if [[ "${testelement}" != *"children"* ]]
then
elements="${elements}${testelement}"
fi
done
echo "${elements}"
}
#test login
header="Content-Type: application/x-www-form-urlencoded"
applicationsRequest="${serverURL}/HyperionPlanning/rest/v3/applications"
response=$(curl -X "GET" -s -w "%{http_code}" -u "${user}:${pass}" -H "${header}" "${applicationsRequest}")
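# curl -w "%{http_code}" appends the HTTP status code after the JSON body;
# extract it by taking everything after the last '}'.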
http_response_code=$(echo "${response}" | rev | cut -d'}' -f1 | rev)
if [ ${http_response_code} -ne 200 ]
then
echo "${response}"
exit
fi
#retrieve dimension hierarchy from application
echo "Retrieving member list for split dimension ${splitDimension}"
splitDimensionRequest="${serverURL}/HyperionPlanning/rest/v3/internal/applications/${applicationName}/plantypes/${cubeName}/dimensions/${splitDimension}"
response=$(curl -X GET -s -w "%{http_code}" -u "${user}:${pass}" -o "response-memberlist.txt" -D "respHeader-memberlist.txt" -H "${header}" "${splitDimensionRequest}")
http_response_code=$(echo "${response}" | rev | cut -d'}' -f1 | rev)
if [ ${http_response_code} -ne 200 ]
then
echo "${response}"
exit
fi
echo "${splitDimension} member list retrieved"
#search for the top of the export hierarchy
echo "Searching for member ${topLevelMemberForExport} in hierarchy"
memberList=$(cat response-memberlist.txt | jq --arg topLevelMember "${topLevelMemberForExport}" 'recurse(try .children[]) | select (.name == $topLevelMember)')
if [[ "${memberList}" == "" ]]
then
echo "${topLevelMemberForExport} not found in hierarchy, exiting ..."
exit 128
fi
echo "Found member ${topLevelMemberForExport} in hierarchy"
#retrieve level 0 members in export hierarchy
echo "Retrieving Level 0 members for hierarchy"
totalCount=$(echo "${memberList}" | grep "name" | wc -l)
grepChildrenCount=$(echo "${memberList}" | grep "children" | wc -l)
levelZeroCount=$((totalCount-grepChildrenCount))
if [[ "${levelZeroCount}" -eq 0 ]]
then
echo "no level 0 members found in hierarchy, exiting ..."
exit 128
fi
echo "${levelZeroCount} Level 0 members for export hierarchy retrieved"
#splitting member list into the number of export files
numberOfEntitiesPerFile=$((levelZeroCount/numberOfExportFiles))
jobsRequest="${serverURL}/HyperionPlanning/rest/v3/applications/${applicationName}/jobs"
header="Content-Type: application/json"
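# For each export file: compute the member index range, submit an EXPORT_DATA
# job, poll the job status until it is no longer "Processing", then download
# and expand Export-<i>.zip and append its rows to the combined CSV.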
for ((i = 1 ; i <= ${numberOfExportFiles}; i++))
do
firstMember=$((($i-1)*numberOfEntitiesPerFile))
if [[ ${i} -lt ${numberOfExportFiles} ]]
then
lastMember=$((i*numberOfEntitiesPerFile))
else
lastMember=$((i*numberOfEntitiesPerFile+levelZeroCount%numberOfExportFiles))
fi
elements=$(getLevel0 "${memberList}")
rowMembers=$(getRowMembers "${elements}" ${firstMember} ${lastMember})
response=$(curl -X POST -s -w "%{http_code}" -u "${user}:${pass}" -o "response-job.txt" -D "respHeader-job.txt" -H "${header}" "${jobsRequest}" -d '{"jobType":"EXPORT_DATA","jobName":"'"${exportJobName}"'","parameters":{"exportFileName":"Export-'"${i}"'.zip","rowMembers":"'"${rowMembers}"'","columnMembers":"'"${columnMembers}"'","povMembers":"'"${povMembers}"'"}}')
echo "Started export job " $i " out of " $numberOfExportFiles
jobId=$(cat response-job.txt | grep -o '"jobId":[^, }]*' | cut -d':' -f2)
descriptiveStatus=$(cat response-job.txt | grep -o '"descriptiveStatus":[^, }]*' | cut -d':' -f2 | sed -e 's/"//g')
jobIdRequest="${serverURL}/HyperionPlanning/rest/v3/applications/${applicationName}/jobs/${jobId}"
response=$(curl -X GET -s -w "%{http_code}" -u "${user}:${pass}" -o "response-jobstatus.txt" -D "respHeader-jobstatus.txt" -H "${header}" "${jobIdRequest}")
jobId=$(cat response-jobstatus.txt | grep -o '"jobId":[^, }]*' | cut -d':' -f2)
descriptiveStatus=$(cat response-jobstatus.txt | grep -o '"descriptiveStatus":[^, }]*' | cut -d':' -f2 | sed -e 's/"//g')
while [[ "${descriptiveStatus}" == "Processing" ]]
do
echo "${descriptiveStatus}"
sleep 10
response=$(curl -X GET -s -w "%{http_code}" -u "${user}:${pass}" -o "response-jobstatus.txt" -D "respHeader-jobstatus.txt" -H "${header}" "${jobIdRequest}")
descriptiveStatus=$(cat response-jobstatus.txt | grep -o '"descriptiveStatus":[^, }]*' | cut -d':' -f2 | sed -e 's/"//g')
done
echo "${descriptiveStatus}"
echo "Downloading export file ..."
contentsRequest="${serverURL}/interop/rest/11.1.2.3.600/applicationsnapshots/Export-${i}.zip/contents"
curl -X GET -s -w "%{http_code}" -u "${user}:${pass}" -D "respHeader-download.txt" "${contentsRequest}" > "${exportFilePrefix}-${i}.zip"
echo "Expanding archive ..."
unzip "${exportFilePrefix}-${i}.zip" -d "${exportFilePrefix}-${i}"
rm "${exportFilePrefix}-${i}.zip"
echo "Writing to outputfile ..."
if [[ -d "${exportFilePrefix}-${i}" ]]
then
find "${exportFilePrefix}-${i}" -name \*.csv | xargs cat | tail -n +2 >> "${exportFilePrefix}.csv"
fi
done
zip "${exportFilePrefix}.zip" "${exportFilePrefix}.csv"
echo "cleaning up ..."
find . -name "${exportFilePrefix}-*" | xargs rm -r
rm "${exportFilePrefix}.csv"
To export a large number of cells from an aggregate storage (ASO) cube, complete these steps:
Copy the PowerShell or Bash script into a file and save it as ASOCellExport.ps1 or ASOCellExport.sh.
Table 3-16 Variable Values to Include in the PowerShell and Bash Scripts
| Variable | Description |
|---|---|
| user | Domain and user name of a Service Administrator, in DOMAIN.USER format. |
| pass | Password of the Service Administrator, or the location of the encrypted password file. For information on creating an encrypted password file, see the encrypt command. |
| serverURL | URL of the Oracle Fusion Cloud Enterprise Performance Management environment. |
| applicationName | Name of a Planning, Financial Consolidation and Close, Tax Reporting, or Enterprise Profitability and Cost Management application. |
| cubeName | Name of a cube in the application. |
| splitDimension | Name of the dimension whose members are used to split the export into groups. |
| topLevelMemberForExport | Name of the member in the dimension sub-hierarchy from which the list of level 0 members is built. |
| exportJobName | Name of an existing job of type Export Data. The parameters defined in this job are overridden by the parameters set in the script. |
| exportFilePrefix | File name prefix used to identify the files generated by the export job. |
| columnMembers | Members to include on the columns of the export. |
| povMembers | POV members to include in the export. The POV members must cover all remaining dimensions and may include functions. |
| numberOfExportFiles | Number of export jobs to run for this export operation. If the export fails because of the query limit, increase this number. |
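For illustration only, a hypothetical set of Bash variable assignments; every value below is invented and must be replaced with values from your own environment:

user='exampledomain.jdoe'                        # hypothetical DOMAIN.USER
pass='/home/jdoe/password.epw'                   # hypothetical encrypted password file
serverURL='https://example-epm.oraclecloud.com'  # hypothetical environment URL
applicationName='Vision'                         # hypothetical application name
cubeName='VisASO'                                # hypothetical ASO cube
splitDimension='Entity'
topLevelMemberForExport='Total Entity'
exportJobName='ExportLevel0Data'                 # must be an existing Export Data job
exportFilePrefix='ASOCellExport'
columnMembers='Period'
povMembers='<POV_MEMBERS>'                       # all remaining dimensions, comma separated
numberOfExportFiles=4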