Use the PowerShell or Bash script in this section to export a large number of cells from an Aggregate Storage (ASO) cube.
Because of the limits imposed by the Oracle Essbase QUERYRESULTLIMIT setting, it is not possible to export a large amount of data through the user interface. The scripts in this section split the export operation into a specified number of jobs, run each job, download the exported data, and concatenate the exported files into a single export file, ensuring that only one header row exists.
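As a rough sketch of that approach (illustrative only; the member count, chunk count, and file names below are made up and are not part of the scripts that follow), the split and the single-header merge work like this:
#!/bin/bash
# Illustration only: hypothetical counts and file names, not part of ASOCellExport.sh.
levelZeroCount=10            # pretend the split dimension has 10 level 0 members
numberOfExportFiles=3        # pretend the export is split across 3 jobs
chunkSize=$((levelZeroCount/numberOfExportFiles))
remainder=$((levelZeroCount%numberOfExportFiles))
echo "Each job exports ${chunkSize} members; the last job also exports the ${remainder} remaining member(s)"
# Fake per-job CSV fragments, each with its own header row.
for i in 1 2 3
do
    printf 'Entity,Amount\nE%s,100\n' "${i}" > "fragment-${i}.csv"
done
# Merge the fragments so that the header row appears exactly once.
head -n 1 fragment-1.csv > combined.csv
for f in fragment-*.csv
do
    tail -n +2 "${f}" >> combined.csv
done
cat combined.csv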
Note:
These scripts run an existing job of the export data type. For detailed instructions on creating jobs, see "Managing Jobs" in Administering Planning.
PowerShell Script
$user = '<USERNAME>'
$pass = '<PASSWORD>'
$serverURL = '<URL>'
$applicationName = '<APPLICATIONNAME>'
$cubeName = '<CUBENAME>'
$splitDimension = '<DIMENSION_TO_SPLIT_THE_EXPORT>'
$topLevelMemberForExport = '<TOP_MEMBER_FOR_EXPORT>'
$exportJobName = '<EXPORT_JOB_NAME>'
$exportFilePrefix = '<PREFIX_FOR_EXPORT_FILE>'
$columnMembers = '<MEMBERS_ON_COLUMNS>'
$povMembers = '<POV_MEMBERS>'
$numberOfExportFiles = <NUMBER_OF_FILES_TO_SPLIT_THE_EXPORT>

$memberArray = @()
$exportFileArray = @()

function getLevel0 ($parent) {
    $parent.children.ForEach({
        if ( $_.children.count -eq 0 ) { $script:memberArray += $_.name }
        getLevel0($_)
    })
}

function findMember ($tree, $memberName) {
    $subtree = ""
    if ($tree.name -eq $memberName){
        return $tree
    } else {
        $tree.children.ForEach({
            #Write-Host $_.name
            if ($subtree -eq ""){ $subtree = findMember $_ $memberName }
        })
        return $subtree
    }
}

#putting together base64 encoded authentication header based on user and password
$encodedCredentials = [Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($($user) + ":" + $($pass)))
$headers = @{ Authorization = "Basic $encodedCredentials" }

#test login
$testRequest = $serverURL + '/HyperionPlanning/rest/v3/applications'
try {
    $response = Invoke-RestMethod -Uri $testRequest -Method Get -Headers $headers -UseBasicParsing
} catch {
    Write-Host $_
    return
}

#retrieve dimension hierarchy from application
Write-Host "Retrieving member list for split dimension " $splitDimension
$request = $serverURL + '/HyperionPlanning/rest/v3/internal/applications/' + $applicationName + '/plantypes/' + $cubeName + '/dimensions/' + $splitDimension
try {
    $response = Invoke-RestMethod -Uri $request -Method Get -Headers $headers -UseBasicParsing
} catch {
    Write-Host $_
    return
}
Write-Host $splitDimension " member list retrieved"

#search for the top of the export hierarchy
Write-Host "Searching for member " $topLevelMemberForExport " in hierarchy"
$member = findMember $response $topLevelMemberForExport
if ( $member.name -ne $topLevelMemberForExport ) {
    Write-Host $topLevelMemberForExport " not found in hierarchy, exiting ..."
    return 128
}
Write-Host "Found member " $topLevelMemberForExport " in hierarchy"

#retrieve level 0 members in export hierarchy
Write-Host "Retrieving Level 0 members for hierarchy"
getLevel0($member)
if ( $memberArray.Length -eq 0 ) {
    Write-Host "no level 0 members found in hierarchy, exiting ..."
    return 128
}
Write-Host $memberArray.Length " Level 0 members for export hierarchy retrieved"

$request = $serverURL + '/HyperionPlanning/rest/v3/applications/' + $applicationName + '/jobs'

#splitting member list into the number of export files
$numberOfEntitiesPerFile = [math]::truncate($memberArray.Length / $numberOfExportFiles)
for ($i = 1; $i -le $numberOfExportFiles; $i++) {
    $memberList = ""
    $firstMember = ($i - 1) * $numberOfEntitiesPerFile
    if ($i -lt $numberOfExportFiles) {
        $lastMember = $i * $numberOfEntitiesPerFile
    } else {
        $lastMember = $i * $numberOfEntitiesPerFile + $memberArray.Length % $numberOfExportFiles
    }
    for ($j = $firstMember; $j -lt $lastMember; $j++) {
        $memberList += $memberArray[$j]
        if ($j -lt $lastMember - 1) {$memberList += ","} #avoid adding a comma (,) after the last member of each set
    }
    $jobDetails=' { "jobType":"EXPORT_DATA","jobName":"' + $exportJobName + '", "parameters":{ "exportFileName":"Export-' + $i + '.zip", "rowMembers":"' + $memberList + '", "columnMembers":"' + $columnMembers + '", "povMembers":"' + $povMembers + '" } }'

    #start export job
    try {
        $response = Invoke-RestMethod -Uri $request -Method Post -Headers $headers -Body $jobDetails -ContentType "application/json"
    } catch {
        Write-Host $_
        return
    }
    Write-Host "Started export job " $i " out of " $numberOfExportFiles

    #checking job status, continue once job is completed
    $statusRequest = $serverURL + '/HyperionPlanning/rest/v3/applications/' + $applicationName + '/jobs/' + $response.jobId
    $statusResponse = Invoke-RestMethod -Uri $statusRequest -Method Get -Headers $headers -UseBasicParsing
    while ( $statusResponse.descriptiveStatus -eq "Processing" ) {
        Write-Host $statusResponse.descriptiveStatus
        Start-Sleep -s 10
        $statusResponse = Invoke-RestMethod -Uri $statusRequest -Method Get -Headers $headers -UseBasicParsing
    }
    Write-Host $statusResponse.descriptiveStatus

    Write-Host "Downloading export file ..."
    $downloadRequest = $serverURL + '/interop/rest/11.1.2.3.600/applicationsnapshots/Export-' + $i + '.zip/contents'
    $statusResponse = Invoke-RestMethod -Uri $downloadRequest -Method Get -Headers $headers -OutFile "$exportFilePrefix-$i.zip"

    Write-Host "Expanding archive ..."
    Expand-Archive -Force -LiteralPath "$exportFilePrefix-$i.zip" -DestinationPath "$exportFilePrefix-$i"
    Remove-Item "$exportFilePrefix-$i.zip"
    Get-ChildItem -Path "$exportFilePrefix-$i" -File -Name | ForEach-Object { $exportFileArray += "$exportFilePrefix-$i\" + $_ }
}

Write-Host "creating outputfile ..."
#write header to outputfile
Get-Content $exportFileArray[0] | Select-Object -First 1 | Out-File "$exportFilePrefix.csv"
#write content to outputfile skipping header
ForEach ($exportFile in $exportFileArray) {
    Get-Content $exportFile | Select-Object -Skip 1 | Out-File -Append "$exportFilePrefix.csv"
}
Compress-Archive -LiteralPath "$exportFilePrefix.csv" -DestinationPath "$exportFilePrefix.zip"

Write-Host "cleaning up ..."
Remove-Item "$exportFilePrefix-*" -Recurse
Remove-Item "$exportFilePrefix.csv"
Bash Script
#!/bin/bash
user='<USERNAME>'
pass='<PASSWORD>'
serverURL='<URL>'
applicationName='<APPLICATIONNAME>'
cubeName='<CUBENAME>'
splitDimension='<DIMENSION_TO_SPLIT_THE_EXPORT>'
topLevelMemberForExport='<TOP_MEMBER_FOR_EXPORT>'
exportJobName='<EXPORT_JOB_NAME>'
exportFilePrefix='<PREFIX_FOR_EXPORT_FILE>'
columnMembers='<MEMBERS_ON_COLUMNS>'
povMembers='<POV_MEMBERS>'
numberOfExportFiles=<NUMBER_OF_FILES_TO_SPLIT_THE_EXPORT>
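# getRowMembers builds a comma-separated list of the member names found in the JSON
# passed as the first argument, keeping only those whose position is at or after the
# second argument (firstMember) and before the third (lastMember).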
getRowMembers() {
local memberList="$1"
local firstMember=$2
local lastMember=$3
local nameCount=0
local rowMember=""
local rowMembers=""
while IFS= read -r line
do
if [[ "${line}" == *"name"* ]]
then
if [[ ${nameCount} -ge ${firstMember} ]] && [[ ${nameCount} -lt ${lastMember} ]]
then
rowMember=$(echo "${line}" | cut -d':' -f2- | sed s'/[",]//g')
rowMembers="${rowMembers}${rowMember},"
fi
((nameCount+=1))
fi
done <<< "${memberList}"
rowMembers=$(echo "${rowMembers}" | rev | cut -d',' -f2- | rev)
echo "${rowMembers}"
}
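# getLevel0 walks the member hierarchy JSON passed as the argument and echoes only the
# JSON objects of level 0 members, that is, members that have no children array.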
getLevel0()
{
local memberList="$1"
local names=$(echo "${memberList}" | jq 'recurse (try .children[]) | .name' | sed -e 's/"//g')
local elements=""
formerIFS=$IFS
IFS=$'\n'
namesarr=($names)
IFS=$formerIFS
for i in ${!namesarr[@]}
do
testelement=$(echo "${memberList}" | jq --arg currentName "${namesarr[i]}" 'recurse (try .children[]) | select(.name==$currentName)')
if [[ "${testelement}" != *"children"* ]]
then
elements="${elements}${testelement}"
fi
done
echo "${elements}"
}
#test login
header="Content-Type: application/x-www-form-urlencoded"
applicationsRequest="${serverURL}/HyperionPlanning/rest/v3/applications"
response=$(curl -X "GET" -s -w "%{http_code}" -u "${user}:${pass}" -H "${header}" "${applicationsRequest}")
http_response_code=$(echo "${response}" | rev | cut -d'}' -f1 | rev)
if [ ${http_response_code} -ne 200 ]
then
echo "${response}"
exit
fi
#retrieve dimension hierarchy from application
echo "Retrieving member list for split dimension ${splitDimension}"
splitDimensionRequest="${serverURL}/HyperionPlanning/rest/v3/internal/applications/${applicationName}/plantypes/${cubeName}/dimensions/${splitDimension}"
response=$(curl -X GET -s -w "%{http_code}" -u "${user}:${pass}" -o "response-memberlist.txt" -D "respHeader-memberlist.txt" -H "${header}" "${splitDimensionRequest}")
http_response_code=$(echo "${response}" | rev | cut -d'}' -f1 | rev)
if [ ${http_response_code} -ne 200 ]
then
echo "${response}"
exit
fi
echo "${splitDimension} member list retrieved"
#search for the top of the export hierarchy
echo "Searching for member ${topLevelMemberForExport} in hierarchy"
memberList=$(cat response-memberlist.txt | jq --arg topLevelMember "${topLevelMemberForExport}" 'recurse(try .children[]) | select (.name == $topLevelMember)')
if [[ "${memberList}" == "" ]]
then
echo "${topLevelMemberForExport} not found in hierarchy, exiting ..."
exit 128
fi
echo "Found member ${topLevelMemberForExport} in hierarchy"
#retrieve level 0 members in export hierarchy
echo "Retrieving Level 0 members for hierarchy"
totalCount=$(echo "${memberList}" | grep "name" | wc -l)
grepChildrenCount=$(echo "${memberList}" | grep "children" | wc -l)
levelZeroCount=$((totalCount-grepChildrenCount))
if [[ "${levelZeroCount}" -eq 0 ]]
then
echo "no level 0 members found in hierarchy, exiting ..."
exit 128
fi
echo "${levelZeroCount} Level 0 members for export hierarchy retrieved"
#splitting member list into the number of export files
numberOfEntitiesPerFile=$((levelZeroCount/numberOfExportFiles))
jobsRequest="${serverURL}/HyperionPlanning/rest/v3/applications/${applicationName}/jobs"
header="Content-Type: application/json"
for ((i = 1 ; i <= ${numberOfExportFiles}; i++))
do
firstMember=$((($i-1)*numberOfEntitiesPerFile))
if [[ ${i} -lt ${numberOfExportFiles} ]]
then
lastMember=$((i*numberOfEntitiesPerFile))
else
lastMember=$((i*numberOfEntitiesPerFile+levelZeroCount%numberOfExportFiles))
fi
elements=$(getLevel0 "${memberList}")
rowMembers=$(getRowMembers "${elements}" ${firstMember} ${lastMember})
response=$(curl -X POST -s -w "%{http_code}" -u "${user}:${pass}" -o "response-job.txt" -D "respHeader-job.txt" -H "${header}" "${jobsRequest}" -d '{"jobType":"EXPORT_DATA","jobName":"'"${exportJobName}"'","parameters":{"exportFileName":"Export-'"${i}"'.zip","rowMembers":"'"${rowMembers}"'","columnMembers":"'"${columnMembers}"'","povMembers":"'"${povMembers}"'"}}')
echo "Started export job " $i " out of " $numberOfExportFiles
jobId=$(cat response-job.txt | grep -o '"jobId":[^, }]*' | cut -d':' -f2)
descriptiveStatus=$(cat response-job.txt | grep -o '"descriptiveStatus":[^, }]*' | cut -d':' -f2 | sed -e 's/"//g')
jobIdRequest="${serverURL}/HyperionPlanning/rest/v3/applications/${applicationName}/jobs/${jobId}"
response=$(curl -X GET -s -w "%{http_code}" -u "${user}:${pass}" -o "response-jobstatus.txt" -D "respHeader-jobstatus.txt" -H "${header}" "${jobIdRequest}")
jobId=$(cat response-jobstatus.txt | grep -o '"jobId":[^, }]*' | cut -d':' -f2)
descriptiveStatus=$(cat response-jobstatus.txt | grep -o '"descriptiveStatus":[^, }]*' | cut -d':' -f2 | sed -e 's/"//g')
while [[ "${descriptiveStatus}" == "Processing" ]]
do
echo "${descriptiveStatus}"
sleep 10
response=$(curl -X GET -s -w "%{http_code}" -u "${user}:${pass}" -o "response-jobstatus.txt" -D "respHeader-jobstatus.txt" -H "${header}" "${jobIdRequest}")
descriptiveStatus=$(cat response-jobstatus.txt | grep -o '"descriptiveStatus":[^, }]*' | cut -d':' -f2 | sed -e 's/"//g')
done
echo "${descriptiveStatus}"
echo "Downloading export file ..."
contentsRequest="${serverURL}/interop/rest/11.1.2.3.600/applicationsnapshots/Export-${i}.zip/contents"
curl -X GET -s -w "%{http_code}" -u "${user}:${pass}" -D "respHeader-download.txt" "${contentsRequest}" > "${exportFilePrefix}-${i}.zip"
echo "Expanding archive ..."
unzip "${exportFilePrefix}-${i}.zip" -d "${exportFilePrefix}-${i}"
rm "${exportFilePrefix}-${i}.zip"
echo "Writing to outputfile ..."
if [[ -d "${exportFilePrefix}-${i}" ]]
then
find "${exportFilePrefix}-${i}" -name \*.csv | xargs cat | tail -n +2 >> "${exportFilePrefix}.csv"
fi
done
zip "${exportFilePrefix}.zip" "${exportFilePrefix}.csv"
echo "cleaning up ..."
find . -name "${exportFilePrefix}-*" | xargs rm -r
rm "${exportFilePrefix}.csv"
To export a large number of cells from an Aggregate Storage (ASO) cube, create ASOCellExport.ps1 or ASOCellExport.sh by copying the corresponding script above, and then update the variable values described in the following table; an illustrative set of values is shown after the table.
Table 3-16 Variable Values to Include in the PowerShell and Bash Scripts
| Variable | Description |
|---|---|
| user | Domain and user name of a Service Administrator, in DOMAIN.USER format. |
| pass | Password of the Service Administrator, or the location of the encrypted password file. See the encrypt command for information on creating an encrypted password file. |
| serverURL | URL of the Oracle Fusion Cloud Enterprise Performance Management environment. |
| applicationName | Name of a Planning, Financial Consolidation and Close, Tax Reporting, or Enterprise Profitability and Cost Management application. |
| cubeName | Name of a cube in the application. |
| splitDimension | Name of the dimension whose members are used to split the export into groups. |
| topLevelMemberForExport | Name of the member of the dimension sub-hierarchy under which the list of level 0 members is created. |
| exportJobName | Name of an existing job of the Export Data type. The settings specified in this job are overridden by the parameters you set in the script. |
| exportFilePrefix | A prefix that uniquely identifies the files generated by the export job. |
| columnMembers | The column members to include in the export. |
| povMembers | Point of View members to include in the export. POV members must include all remaining dimensions and may include member functions. |
| numberOfExportFiles | Number of jobs to run for this export operation. If the export still fails because of query limit restrictions, increase this number. |
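For example, the variable block at the top of ASOCellExport.sh might be set as follows. All values shown here are hypothetical; replace them with the user, URL, application, cube, dimension, job, and member names of your own environment:
user='exampledomain.jdoe'                    # hypothetical Service Administrator
pass='Example_Pwd1'                          # or the location of an encrypted password file
serverURL='https://example-planning.oraclecloud.com'
applicationName='Vision'
cubeName='VisASO'
splitDimension='Entity'
topLevelMemberForExport='Total Entity'
exportJobName='ASOCellExport'                # an existing Export Data job
exportFilePrefix='VisASOData'
columnMembers='Period'
povMembers='ILvl0Descendants(Account),ILvl0Descendants(Year),ILvl0Descendants(Scenario),ILvl0Descendants(Version)'
numberOfExportFiles=4
The same values apply to the corresponding variables at the top of ASOCellExport.ps1; increase numberOfExportFiles if an individual export job still exceeds the query limit.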