You can use the PowerShell or Bash script in this section to export a large number of cells from an aggregate storage (ASO) cube.
Because of the limit enforced through the Oracle Essbase QUERYRESULTLIMIT setting, you cannot export a large amount of data from the user interface. The scripts provided in this section split the export operation into a specified number of jobs, run each job, download the exported data, and then concatenate the individual export files into a single export file with only one header row.
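Both scripts divide the list of level 0 members evenly across the export jobs, with the remainder going to the last job. The following is a minimal sketch of that chunking arithmetic, using hypothetical counts (10 members, 3 files) rather than values from a real application:

```shell
#!/bin/bash
# Sketch of the chunking arithmetic used by the scripts below.
# The real scripts derive levelZeroCount from the dimension hierarchy
# returned by the REST API; these values are placeholders.
levelZeroCount=10          # number of level 0 members found
numberOfExportFiles=3      # number of export jobs to run

numberOfEntitiesPerFile=$((levelZeroCount / numberOfExportFiles))
for ((i = 1; i <= numberOfExportFiles; i++)); do
  firstMember=$(((i - 1) * numberOfEntitiesPerFile))
  if ((i < numberOfExportFiles)); then
    lastMember=$((i * numberOfEntitiesPerFile))
  else
    # the last job picks up the remainder
    lastMember=$((i * numberOfEntitiesPerFile + levelZeroCount % numberOfExportFiles))
  fi
  echo "job $i: members $firstMember..$((lastMember - 1))"
done
```

With 10 members and 3 files, jobs 1 and 2 each get 3 members and job 3 gets the remaining 4.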
PowerShell Script
```powershell
$user = '<USERNAME>'
$pass = '<PASSWORD>'
$serverURL = '<URL>'
$applicationName = '<APPLICATIONNAME>'
$cubeName = '<CUBENAME>'
$splitDimension = '<DIMENSION_TO_SPLIT_THE_EXPORT>'
$topLevelMemberForExport = '<TOP_MEMBER_FOR_EXPORT>'
$exportJobName = '<EXPORT_JOB_NAME>'
$exportFilePrefix = '<PREFIX_FOR_EXPORT_FILE>'
$columnMembers = '<MEMBERS_ON_COLUMNS>'
$povMembers = '<POV_MEMBERS>'
$numberOfExportFiles = <NUMBER_OF_FILES_TO_SPLIT_THE_EXPORT>

$memberArray = @()
$exportFileArray = @()

function getLevel0 ($parent) {
    $parent.children.ForEach({
        if ( $_.children.count -eq 0 ) { $script:memberArray += $_.name }
        getLevel0($_)
    })
}

function findMember ($tree, $memberName) {
    $subtree = ""
    if ($tree.name -eq $memberName) {
        return $tree
    } else {
        $tree.children.ForEach({
            if ($subtree -eq "") { $subtree = findMember $_ $memberName }
        })
        return $subtree
    }
}

# put together the Base64-encoded authentication header based on user and password
$encodedCredentials = [Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($($user) + ":" + $($pass)))
$headers = @{ Authorization = "Basic $encodedCredentials" }

# test login
$testRequest = $serverURL + '/HyperionPlanning/rest/v3/applications'
try {
    $response = Invoke-RestMethod -Uri $testRequest -Method Get -Headers $headers -UseBasicParsing
} catch {
    Write-Host $_
    return
}

# retrieve dimension hierarchy from application
Write-Host "Retrieving member list for split dimension " $splitDimension
$request = $serverURL + '/HyperionPlanning/rest/v3/internal/applications/' + $applicationName + '/plantypes/' + $cubeName + '/dimensions/' + $splitDimension
try {
    $response = Invoke-RestMethod -Uri $request -Method Get -Headers $headers -UseBasicParsing
} catch {
    Write-Host $_
    return
}
Write-Host $splitDimension " member list retrieved"

# search for the top of the export hierarchy
Write-Host "Searching for member " $topLevelMemberForExport " in hierarchy"
$member = findMember $response $topLevelMemberForExport
if ( $member.name -ne $topLevelMemberForExport ) {
    Write-Host $topLevelMemberForExport " not found in hierarchy, exiting ..."
    return 128
}
Write-Host "Found member " $topLevelMemberForExport " in hierarchy"

# retrieve level 0 members in export hierarchy
Write-Host "Retrieving Level 0 members for hierarchy"
getLevel0($member)
if ( $memberArray.Length -eq 0 ) {
    Write-Host "no level 0 members found in hierarchy, exiting ..."
    return 128
}
Write-Host $memberArray.Length " Level 0 members for export hierarchy retrieved"

$request = $serverURL + '/HyperionPlanning/rest/v3/applications/' + $applicationName + '/jobs'

# split the member list into the number of export files
$numberOfEntitiesPerFile = [math]::truncate($memberArray.Length / $numberOfExportFiles)
for ($i = 1; $i -le $numberOfExportFiles; $i++) {
    $memberList = ""
    $firstMember = ($i - 1) * $numberOfEntitiesPerFile
    if ($i -lt $numberOfExportFiles) {
        $lastMember = $i * $numberOfEntitiesPerFile
    } else {
        $lastMember = $i * $numberOfEntitiesPerFile + $memberArray.Length % $numberOfExportFiles
    }
    for ($j = $firstMember; $j -lt $lastMember; $j++) {
        $memberList += $memberArray[$j]
        # avoid adding a comma (,) after the last member of each set
        if ($j -lt $lastMember - 1) { $memberList += "," }
    }
    $jobDetails = '{
        "jobType":"EXPORT_DATA","jobName":"' + $exportJobName + '",
        "parameters":{
            "exportFileName":"Export-' + $i + '.zip",
            "rowMembers":"' + $memberList + '",
            "columnMembers":"' + $columnMembers + '",
            "povMembers":"' + $povMembers + '"
        }
    }'

    # start export job
    try {
        $response = Invoke-RestMethod -Uri $request -Method Post -Headers $headers -Body $jobDetails -ContentType "application/json"
    } catch {
        Write-Host $_
        return
    }
    Write-Host "Started export job " $i " out of " $numberOfExportFiles

    # check job status, continue once the job is completed
    $statusRequest = $serverURL + '/HyperionPlanning/rest/v3/applications/' + $applicationName + '/jobs/' + $response.jobId
    $statusResponse = Invoke-RestMethod -Uri $statusRequest -Method Get -Headers $headers -UseBasicParsing
    while ( $statusResponse.descriptiveStatus -eq "Processing" ) {
        Write-Host $statusResponse.descriptiveStatus
        Start-Sleep -s 10
        $statusResponse = Invoke-RestMethod -Uri $statusRequest -Method Get -Headers $headers -UseBasicParsing
    }
    Write-Host $statusResponse.descriptiveStatus

    Write-Host "Downloading export file ..."
    $downloadRequest = $serverURL + '/interop/rest/11.1.2.3.600/applicationsnapshots/Export-' + $i + '.zip/contents'
    $statusResponse = Invoke-RestMethod -Uri $downloadRequest -Method Get -Headers $headers -OutFile "$exportFilePrefix-$i.zip"

    Write-Host "Expanding archive ..."
    Expand-Archive -Force -LiteralPath "$exportFilePrefix-$i.zip" -DestinationPath "$exportFilePrefix-$i"
    Remove-Item "$exportFilePrefix-$i.zip"
    Get-ChildItem -Path "$exportFilePrefix-$i" -File -Name | ForEach-Object { $exportFileArray += "$exportFilePrefix-$i\" + $_ }
}

Write-Host "creating outputfile ..."
# write header to outputfile
Get-Content $exportFileArray[0] | Select-Object -First 1 | Out-File "$exportFilePrefix.csv"
# write content to outputfile, skipping headers
ForEach ($exportFile in $exportFileArray) {
    Get-Content $exportFile | Select-Object -Skip 1 | Out-File -Append "$exportFilePrefix.csv"
}
Compress-Archive -LiteralPath "$exportFilePrefix.csv" -DestinationPath "$exportFilePrefix.zip"

Write-Host "cleaning up ..."
Remove-Item "$exportFilePrefix-*" -Recurse
Remove-Item "$exportFilePrefix.csv"
```
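Both scripts use the same status-polling pattern: request the job status, then keep polling (sleeping 10 seconds between calls) while `descriptiveStatus` is still "Processing". A minimal sketch of that loop, with the REST call replaced by a stub counter so it is self-contained:

```shell
#!/bin/bash
# Sketch of the job-status polling loop. The counter stands in for the REST
# call GET .../applications/<app>/jobs/<jobId>, from which the real scripts
# read the descriptiveStatus field; they also sleep 10 seconds per poll.
attempts=0
status="Processing"
while [ "$status" = "Processing" ]; do
  attempts=$((attempts + 1))
  if [ "$attempts" -ge 3 ]; then
    status="Completed"   # stubbed: the third poll reports completion
  fi
done
echo "$status after $attempts polls"
```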
Bash Script
```shell
#!/bin/bash

user='<USERNAME>'
pass='<PASSWORD>'
serverURL='<URL>'
applicationName='<APPLICATIONNAME>'
cubeName='<CUBENAME>'
splitDimension='<DIMENSION_TO_SPLIT_THE_EXPORT>'
topLevelMemberForExport='<TOP_MEMBER_FOR_EXPORT>'
exportJobName='<EXPORT_JOB_NAME>'
exportFilePrefix='<PREFIX_FOR_EXPORT_FILE>'
columnMembers='<MEMBERS_ON_COLUMNS>'
povMembers='<POV_MEMBERS>'
numberOfExportFiles=<NUMBER_OF_FILES_TO_SPLIT_THE_EXPORT>

getRowMembers() {
    local memberList="$1"
    local firstMember=$2
    local lastMember=$3
    local nameCount=0
    local rowMember=""
    local rowMembers=""
    while IFS= read -r line
    do
        if [[ "${line}" == *"name"* ]]
        then
            if [[ ${nameCount} -ge ${firstMember} ]] && [[ ${nameCount} -lt ${lastMember} ]]
            then
                rowMember=$(echo "${line}" | cut -d':' -f2- | sed s'/[",]//g')
                rowMembers="${rowMembers}${rowMember},"
            fi
            ((nameCount+=1))
        fi
    done <<< "${memberList}"
    rowMembers=$(echo "${rowMembers}" | rev | cut -d',' -f2- | rev)
    echo "${rowMembers}"
}

getLevel0() {
    local memberList="$1"
    local names=$(echo "${memberList}" | jq 'recurse (try .children[]) | .name' | sed -e 's/"//g')
    local elements=""
    formerIFS=$IFS
    IFS=$'\n'
    namesarr=($names)
    IFS=$formerIFS
    for i in ${!namesarr[@]}
    do
        testelement=$(echo "${memberList}" | jq --arg currentName "${namesarr[i]}" 'recurse (try .children[]) | select(.name==$currentName)')
        if [[ "${testelement}" != *"children"* ]]
        then
            elements="${elements}${testelement}"
        fi
    done
    echo "${elements}"
}

#test login
header="Content-Type: application/x-www-form-urlencoded"
applicationsRequest="${serverURL}/HyperionPlanning/rest/v3/applications"
response=$(curl -X "GET" -s -w "%{http_code}" -u "${user}:${pass}" -H "${header}" "${applicationsRequest}")
http_response_code=$(echo "${response}" | rev | cut -d'}' -f1 | rev)
if [ ${http_response_code} -ne 200 ]
then
    echo "${response}"
    exit
fi

#retrieve dimension hierarchy from application
echo "Retrieving member list for split dimension ${splitDimension}"
splitDimensionRequest="${serverURL}/HyperionPlanning/rest/v3/internal/applications/${applicationName}/plantypes/${cubeName}/dimensions/${splitDimension}"
response=$(curl -X GET -s -w "%{http_code}" -u "${user}:${pass}" -o "response-memberlist.txt" -D "respHeader-memberlist.txt" -H "${header}" "${splitDimensionRequest}")
http_response_code=$(echo "${response}" | rev | cut -d'}' -f1 | rev)
if [ ${http_response_code} -ne 200 ]
then
    echo "${response}"
    exit
fi
echo "${splitDimension} member list retrieved"

#search for the top of the export hierarchy
echo "Searching for member ${topLevelMemberForExport} in hierarchy"
memberList=$(cat response-memberlist.txt | jq --arg topLevelMember "${topLevelMemberForExport}" 'recurse(try .children[]) | select (.name == $topLevelMember)')
if [[ "${memberList}" == "" ]]
then
    echo "${topLevelMemberForExport} not found in hierarchy, exiting ..."
    exit 128
fi
echo "Found member ${topLevelMemberForExport} in hierarchy"

#retrieve level 0 members in export hierarchy
echo "Retrieving Level 0 members for hierarchy"
totalCount=$(echo "${memberList}" | grep "name" | wc -l)
grepChildrenCount=$(echo "${memberList}" | grep "children" | wc -l)
levelZeroCount=$((totalCount-grepChildrenCount))
if [[ "${levelZeroCount}" -eq 0 ]]
then
    echo "no level 0 members found in hierarchy, exiting ..."
    exit 128
fi
echo "${levelZeroCount} Level 0 members for export hierarchy retrieved"

#splitting member list into the number of export files
numberOfEntitiesPerFile=$((levelZeroCount/numberOfExportFiles))
jobsRequest="${serverURL}/HyperionPlanning/rest/v3/applications/${applicationName}/jobs"
header="Content-Type: application/json"
for ((i = 1 ; i <= ${numberOfExportFiles}; i++))
do
    firstMember=$((($i-1)*numberOfEntitiesPerFile))
    if [[ ${i} -lt ${numberOfExportFiles} ]]
    then
        lastMember=$((i*numberOfEntitiesPerFile))
    else
        lastMember=$((i*numberOfEntitiesPerFile+levelZeroCount%numberOfExportFiles))
    fi
    elements=$(getLevel0 "${memberList}")
    rowMembers=$(getRowMembers "${elements}" ${firstMember} ${lastMember})
    response=$(curl -X POST -s -w "%{http_code}" -u "${user}:${pass}" -o "response-job.txt" -D "respHeader-job.txt" -H "${header}" "${jobsRequest}" -d '{"jobType":"EXPORT_DATA","jobName":"'"${exportJobName}"'","parameters":{"exportFileName":"Export-'"${i}"'.zip","rowMembers":"'"${rowMembers}"'","columnMembers":"'"${columnMembers}"'","povMembers":"'"${povMembers}"'"}}')
    echo "Started export job " $i " out of " $numberOfExportFiles
    jobId=$(cat response-job.txt | grep -o '"jobId":[^, }]*' | cut -d':' -f2)
    descriptiveStatus=$(cat response-job.txt | grep -o '"descriptiveStatus":[^, }]*' | cut -d':' -f2 | sed -e 's/"//g')
    jobIdRequest="${serverURL}/HyperionPlanning/rest/v3/applications/${applicationName}/jobs/${jobId}"
    response=$(curl -X GET -s -w "%{http_code}" -u "${user}:${pass}" -o "response-jobstatus.txt" -D "respHeader-jobstatus.txt" -H "${header}" "${jobIdRequest}")
    jobId=$(cat response-jobstatus.txt | grep -o '"jobId":[^, }]*' | cut -d':' -f2)
    descriptiveStatus=$(cat response-jobstatus.txt | grep -o '"descriptiveStatus":[^, }]*' | cut -d':' -f2 | sed -e 's/"//g')
    while [[ "${descriptiveStatus}" == "Processing" ]]
    do
        echo "${descriptiveStatus}"
        sleep 10
        response=$(curl -X GET -s -w "%{http_code}" -u "${user}:${pass}" -o "response-jobstatus.txt" -D "respHeader-jobstatus.txt" -H "${header}" "${jobIdRequest}")
        descriptiveStatus=$(cat response-jobstatus.txt | grep -o '"descriptiveStatus":[^, }]*' | cut -d':' -f2 | sed -e 's/"//g')
    done
    echo "${descriptiveStatus}"

    echo "Downloading export file ..."
    contentsRequest="${serverURL}/interop/rest/11.1.2.3.600/applicationsnapshots/Export-${i}.zip/contents"
    curl -X GET -s -w "%{http_code}" -u "${user}:${pass}" -D "respHeader-download.txt" "${contentsRequest}" > "${exportFilePrefix}-${i}.zip"

    echo "Expanding archive ..."
    unzip "${exportFilePrefix}-${i}.zip" -d "${exportFilePrefix}-${i}"
    rm "${exportFilePrefix}-${i}.zip"

    echo "Writing to outputfile ..."
    if [[ -d "${exportFilePrefix}-${i}" ]]
    then
        find "${exportFilePrefix}-${i}" -name \*.csv | xargs cat | tail -n +2 >> "${exportFilePrefix}.csv"
    fi
done

zip "${exportFilePrefix}.zip" "${exportFilePrefix}.csv"
echo "cleaning up ..."
find . -name "${exportFilePrefix}-*" | xargs rm -r
rm "${exportFilePrefix}.csv"
```
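The concatenation step at the end of each script keeps only one header row: write the header once, then append only data rows from each part file. The following standalone sketch shows that pattern with throwaway sample files (the file names and contents are hypothetical, not real export output):

```shell
#!/bin/bash
# Demonstrates the single-header merge used at the end of both scripts.
# part-1.csv and part-2.csv are throwaway sample files for illustration.
workdir=$(mktemp -d)
printf 'Entity,Period,Value\nE1,Jan,10\n' > "$workdir/part-1.csv"
printf 'Entity,Period,Value\nE2,Jan,20\n' > "$workdir/part-2.csv"

head -n 1 "$workdir/part-1.csv" > "$workdir/merged.csv"   # header, once
for f in "$workdir"/part-*.csv; do
  tail -n +2 "$f" >> "$workdir/merged.csv"                # data rows only
done
cat "$workdir/merged.csv"
lineCount=$(wc -l < "$workdir/merged.csv")
rm -r "$workdir"
```

The merged file contains the header plus one data row per part file, so duplicated headers never reach the combined export.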
To export a large number of cells from an ASO (aggregate storage) cube:

1. Copy the PowerShell or Bash script above into a file (for example, ASOCellExport.ps1 or ASOCellExport.sh).
2. Update the variable values in the script file using the information in the following table.

Table 3-15 Variable Values to Include in the PowerShell and Bash Scripts
Variable | Description
---|---
`user` | Domain and user name of a Service Administrator, in DOMAIN.USER format.
`pass` | Password of the Service Administrator, or the location of an encrypted password file. For information on creating an encrypted password file, see the encrypt command.
`serverURL` | URL of the Oracle Enterprise Performance Management Cloud environment.
`applicationName` | Name of the Planning, Financial Consolidation and Close, Tax Reporting, or Enterprise Profitability and Cost Management application.
`cubeName` | Name of a cube in the application.
`splitDimension` | Name of the dimension whose members are used to split the export into groups.
`topLevelMemberForExport` | Name of the member in the dimension subhierarchy from which the list of level 0 members is generated.
`exportJobName` | Name of an existing job of type export data. The settings specified in the job are overwritten by the parameters set in the script.
`exportFilePrefix` | File name prefix that uniquely identifies the files created by the export job.
`columnMembers` | Column members to include in the export.
`povMembers` | POV members to include in the export. The POV must include all remaining dimensions and can include functions.
`numberOfExportFiles` | Number of jobs to run for this export operation. If the export still fails because of the query limit, increase this number.
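Filled in, the variable block at the top of the Bash script might look like the following. Every value here is a hypothetical placeholder for illustration only; substitute the application, cube, dimension, and member names, URL, and credentials from your own environment.

```shell
#!/bin/bash
# Hypothetical example values only; none of these names come from the source.
user='exampledomain.jdoe'
pass='Example123'
serverURL='https://example-env.epm.example-dc.oraclecloud.com'
applicationName='Vision'
cubeName='VisASO'
splitDimension='Entity'
topLevelMemberForExport='Total_Entity'
exportJobName='ASOCubeExport'
exportFilePrefix='entityexport'
columnMembers='Period'
povMembers='Account,Years,Scenario'
numberOfExportFiles=4
echo "splitting export into ${numberOfExportFiles} files"
```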