Best way to generate and return large XML files without saving locally? (RunBaseBatch)

Posted by JR-26060826-0
Hello everybody

I am currently generating an XML file inside X++ and I save it locally before streaming it to the user.

Here is the code:

XmlDocument             xmlDoc;
str                     xmlFileName;
str                     xmlFilePath;
System.IO.MemoryStream  memoryStream;

// Build the file name and a path in the server's temp folder
xmlFileName = _nameFile + '.xml';
xmlFilePath = System.IO.Path::GetTempPath() + xmlFileName;

// Save the document to local disk
xmlDoc.save(xmlFilePath);

// Read the file back into memory and stream it to the user
memoryStream = new System.IO.MemoryStream(System.IO.File::ReadAllBytes(xmlFilePath));
File::SendFileToUser(memoryStream, xmlFileName);
 

This works, but:

Problem

In the future, the generated XML files may become very large.

Saving large files on the local disk is not the best option.

My Question

Is there an alternative solution that allows me to:

  • Generate and return the XML without saving it to a local file,

  • Or store it somewhere safer

  • And still download it to the user when needed?

 

Thanks for the help.

  • André Arnaud de Calavon (300,345, Super User 2025 Season 2):
    Hi,
     
    Can you please clarify if your question is related to Dynamics AX (2012 or before) or Dynamics 365 F&O? You used both tags, which is contradictory. Most likely your question is about F&O, but please confirm.

    Anyway, why do you save it locally? You can get the XML content directly into a memory stream.
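    A minimal sketch of that idea, assuming the X++ XmlNode method outerXml() is available to serialize the document to a string (the encoding and stream handling below are illustrative, not a verified implementation):
     
    // Sketch: send the XML to the user without writing a temp file.
    // Assumes xmlDoc is the fully built XmlDocument from the question.
    str                     xml = xmlDoc.outerXml();
    System.Text.Encoding    utf8;
    System.Byte[]           bytes;
    System.IO.MemoryStream  memoryStream;
     
    // Encode the XML text as UTF-8 bytes and wrap them in a memory stream
    utf8         = System.Text.Encoding::get_UTF8();
    bytes        = utf8.GetBytes(xml);
    memoryStream = new System.IO.MemoryStream(bytes);
     
    // Hand the in-memory stream straight to the download helper
    File::SendFileToUser(memoryStream, _nameFile + '.xml');
     
    Note that this still holds the whole document in RAM, which is the concern Martin raises below.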
  • Suggested answer
    Martin Dráb (237,114, Most Valuable Professional):
    Which option is best depends on what you do with the XML document. Unfortunately, you gave us no description, and your code snippet is completely missing this part.
     
    If you want to keep using the XmlDocument class, notice that it's a wrapper around System.Xml.XmlDocument, which has a method for saving to a Stream. You can create an extension of XmlDocument with a method exposing document.Save().
     
    Nevertheless, the idea that saving large files on disk is a problem sounds very strange to me. That is what disks are for. What you should be more concerned about is keeping large files in RAM, which is normally much smaller than disk space.
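    As a sketch of that stream-based route: rather than extending the X++ wrapper (whose internals aren't shown in this thread), the example below builds the document with System.Xml.XmlDocument through .NET interop, where Save(Stream) is available directly. The method name and root element are illustrative:
     
    // Sketch: build and serialize the XML entirely in memory via .NET interop.
    public static void sendXmlWithoutTempFile(str _fileName)
    {
        System.Xml.XmlDocument  netDoc = new System.Xml.XmlDocument();
        System.Xml.XmlElement   root   = netDoc.CreateElement('AuditFile'); // illustrative root
        System.IO.MemoryStream  stream = new System.IO.MemoryStream();
     
        netDoc.AppendChild(root);
        // ... append the remaining sections here ...
     
        // Serialize straight into the memory stream instead of a temp file
        netDoc.Save(stream);
     
        // Rewind before handing the stream to the download helper
        stream.Seek(0, System.IO.SeekOrigin::Begin);
        File::SendFileToUser(stream, _fileName);
    }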
  • JR-26060826-0 (24):
    Hi @Martin Dráb and @André Arnaud de Calavon

    Thanks for your answers.
     

    I'm working on a custom feature in Dynamics 365 Finance & Operations where I need to generate a large XML file.

    This XML is a statutory reporting file called SAF-T (Standard Audit File for Tax purposes) — in practice it's just a very big XML with accounting and financial data.

    The process builds the entire XML in memory (thousands of nodes) and the final structure is stored inside an XmlDocument variable named xmlDoc.

     

    Example (simplified):

     
    XmlDocument xmlDoc = XmlDocument::newBlank();
    XmlElement root = xmlDoc.createElement("AuditFile");
    xmlDoc.appendChild(root);

    // Many sections here...
    // Customers, Suppliers, Ledgers, Tax entries, etc.
    // potentially tens of thousands of XML elements
     

    After building the full XML, I currently save it to a local temp file:

     
    xmlDoc.save(xmlFilePath);
     

    Then I load it again into a MemoryStream just to send it to the user.

    My goal is simply to avoid writing to local disk, because:


    • the XML can become very large in production environments

    • some customers have limited disk space on the AOS

    • developer VMs often run out of space very quickly



    So I just need a way to persist or stream the XML without creating a temp file.

  • Suggested answer
    Navneeth Nagrajan (2,265, Super User 2025 Season 2):
     
    For large XML files you can use Azure Blob Storage to host them. In addition, you will need an Azure storage account with a container created. You can reference the Azure.Storage.Blobs .NET assembly (the types are available natively in D365 FO from PU40 onwards).
     
    class AzureBlobStorageHelper
    {
        /// <summary>
        /// Uploads an XML string to Azure Blob Storage using a connection string (recommended).
        /// </summary>
        /// <returns>true if uploaded successfully</returns>
        public static boolean uploadFileToBlobStorage(
            str _connectionString,
            str _containerName,
            str _blobName,
            str _xmlContent)
        {
            boolean ret = false;
     
            try
            {
                // These types are available natively in D365 FO (PU40+)
                Azure.Storage.Blobs.BlobServiceClient   serviceClient;
                Azure.Storage.Blobs.BlobContainerClient containerClient;
                Azure.Storage.Blobs.BlobClient          blobClient;
                System.IO.Stream                        stream;
                System.Text.Encoding                    utf8;
                System.Byte[]                           bytes;
     
                utf8  = System.Text.Encoding::get_UTF8();
                bytes = utf8.GetBytes(_xmlContent);
     
                // Create a stream from the bytes
                stream = new System.IO.MemoryStream(bytes);
     
                // Initialize the clients
                serviceClient   = new Azure.Storage.Blobs.BlobServiceClient(_connectionString);
                containerClient = serviceClient.GetBlobContainerClient(_containerName);
                blobClient      = containerClient.GetBlobClient(_blobName);
     
                // Upload (overwrites if the blob already exists)
                blobClient.Upload(stream, true); // true = overwrite
     
                info(strFmt("XML uploaded successfully: %1/%2", _containerName, _blobName));
                ret = true;
            }
            catch (Exception::Error)
            {
                error("Failed to upload XML to blob storage. Check the connection parameters");
            }
            catch
            {
                error("Error uploading to blob storage");
            }
     
            return ret;
        }
     
        /// Sample usage
        public static void main(Args _args)
        {
            str connectionString = <Retrievefromparamtertable>;
            str containerName    = <blobstoragecontainerdetailsinparametertable>;
            str blobName         = <blobnameparametertablewillhostthedetails>;
            str xmlContent       = <xmlschema>;
     
            AzureBlobStorageHelper::uploadFileToBlobStorage(connectionString, containerName, blobName, xmlContent);
        }
    }
     
    Hope this helps. Happy to answer questions, if any.
  • Martin Dráb (237,114, Most Valuable Professional):
    I already gave you a solution.
     
    But your problem still makes no sense to me. What file size are you talking about? Getting a few hundred GB of disk space is easy and cheap; saying that you'd rather obtain the same amount of RAM sounds ridiculous. The actual problem is RAM, and if you need to work with a huge amount of data, you need to split the work into smaller parts or off-load some data from RAM to... a disk!
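    A sketch of that splitting idea, assuming System.Xml.XmlWriter via .NET interop: the document is written section by section instead of being built as one XmlDocument, so only the current record has to sit in RAM. Element names and the commented-out query are illustrative:
     
    // Sketch: write the SAF-T XML incrementally instead of building the
    // whole document in memory. Element names are illustrative.
    System.IO.MemoryStream  stream = new System.IO.MemoryStream();
    System.Xml.XmlWriter    writer = System.Xml.XmlWriter::Create(stream);
     
    writer.WriteStartDocument();
    writer.WriteStartElement('AuditFile');
     
    // Write each section (customers, suppliers, ledger, ...) one record at a time
    writer.WriteStartElement('Customers');
    // while select custTable { writer.WriteElementString('Name', custTable.name()); }
    writer.WriteEndElement();
     
    writer.WriteEndElement();   // </AuditFile>
    writer.WriteEndDocument();
    writer.Flush();
     
    // Rewind and send; for truly huge files the MemoryStream target could be
    // replaced by a file or blob stream so the data never sits fully in RAM.
    stream.Seek(0, System.IO.SeekOrigin::Begin);
    File::SendFileToUser(stream, 'saft.xml');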
  • André Arnaud de Calavon (300,345, Super User 2025 Season 2):
    Hi,

    There is a standard SAF-T report available in Dynamics 365 Finance: Standard Audit File for Tax (SAF-T) electronic report - Finance | Dynamics 365 | Microsoft Learn
    Are you in one of the supported regions?
    If yes, are you able to use the standard?
    If not, you can check how Microsoft built this in the standard using Electronic Reporting.
  • Navneeth Nagrajan (2,265, Super User 2025 Season 2):
     
    For the developer VM, as Martin mentioned, you can increase the disk size for hosting large files. If it is a cloud-hosted VM (on your own Azure subscription with the D365 FO tools), you can increase the disk space in Azure. If it is a local VM, you can increase the disk space on the host machine and then create a shared folder to host these large XML files there.

    Alternatively, you can avoid writing to a local disk and write instead to a SharePoint site or to Azure Blob Storage. Since the XML files can be huge, you could also consider a CSV-to-XML transformation at the Azure Logic Apps level, or a Power Automate flow if cost per transaction is a concern. I would rather have the file uploaded to blob storage, because for a production deployment you will need all of these configurations set up anyway. The actual production-level XML files won't be hosted in local file storage, and it would be cumbersome to host them there.
     
