Recently I found a Power Query gem: a trick with JScript that allows you to export data from Power Query to CSV without R / DAX Studio / SMS and Registration. However, it comes with a risk. As everything else in our life.

Kudos to user Shi Yang from Stack Overflow, who replied to ‘How to write to DATA sources with Power Query?’.

Shi proposes using the following code (extended with my comments):

let
    // reference to a query you wish to export to CSV
    Source = ReferenceToYourTableOrQuery,
    // demote headers to keep the headers in the resulting CSV;
    // if you don't need headers, remove Table.DemoteHeaders
    Json = Text.FromBinary( Json.FromValue( Table.ToRows( Table.DemoteHeaders( Source ) ) ) ),
    // trigger execution of the script
    Export = Web.Page("<script>
        var fso=new ActiveXObject('Scripting.FileSystemObject');
        var f1=fso.CreateTextFile('C:/Temp/test.csv',true);
        var arr=" & Json & ";
        f1.WriteLine(arr.join('\n'));
        f1.WriteBlankLines(1);
        f1.Close();
    </script>")
in
    Export

All great, but this method doesn’t work with the default settings of Internet Explorer.

Yes… yes… IE is involved somehow.

To make it work, you need to open IE and go to ‘Internet options’.


Tab ‘Security’, then ‘Custom level…’.


Find the setting ‘Initialize and script ActiveX controls not marked as safe for scripting’ and enable it (not secure).


Unfortunately, this is a risky option, as it potentially allows any script to be executed on your workstation.

But if you really need to export something to CSV… you are probably allowed to change this setting on your corporate laptop… right?

For your convenience, IE will show a notification about the security risk each time you open it.


You may click ‘Fix settings for me’ to restore the secure configuration.

Back to Export to CSV

The proposed code is basically a script that writes a text file using the CreateTextFile method:

Export = Web.Page("<script>
    var fso=new ActiveXObject('Scripting.FileSystemObject');
    var f1=fso.CreateTextFile('C:/Temp/test.csv',true);
    var arr=" & Json & ";
    f1.WriteLine(arr.join('\n'));
    f1.WriteBlankLines(1);
    f1.Close();
</script>")

The script creates a FileSystemObject, which lets us work with files as objects, and then creates a text file (with the overwrite option set to true).

It then pastes the prepared JSON into a variable, joins the rows with line breaks, and finally writes everything into the text file.
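To make the mechanics visible, here is a tiny illustration of mine (not part of Shi’s answer) showing what the Json text looks like for a small inline table:

let
    Source = #table( {"Name", "Amount"}, { {"a", 1}, {"b", 2} } ),
    // DemoteHeaders turns the headers into the first data row
    Json = Text.FromBinary( Json.FromValue( Table.ToRows( Table.DemoteHeaders( Source ) ) ) )
in
    Json // returns [["Name","Amount"],["a",1],["b",2]]

In JScript, joining an array of arrays converts each inner array to a comma-separated string, so arr.join('\n') yields exactly three CSV lines: the header, then a,1 and b,2.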

Data Volume

The first thing I did was try to export something not small, but not too big either: a table of roughly 200k rows and 15 columns. The script exported the data successfully. Then I doubled the volume – it went well; tripled – and no CSV file was created.

Obviously, a limit on data volume exists. The JSON string can’t be infinitely large.
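If you want at least a rough feel for the payload size before triggering the script, a trivial extra step (my own addition) helps:

    // the whole script, JSON included, is built as one text value;
    // Text.Length shows how much text IE will have to swallow
    JsonSize = Text.Length( Json )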

I also tried to export a table with over a million rows, but got an ‘Out of memory’ message.

It is clear that we will need paging if we want to export huge tables.

And when I hear ‘paging’ I think about List.Generate.
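As a quick refresher, here is a toy example of mine showing the four arguments List.Generate takes:

// produces {1, 2, 3, 4}
List.Generate(
    () => 1,        // initial state
    each _ <= 4,    // continue while this holds
    each _ + 1,     // next state
    each _          // what to emit on each iteration
)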

Pagination

Instead of CreateTextFile we will need the OpenTextFile method, which can also create a file, but in addition allows appending to an existing one.
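For reference, the FileSystemObject IOMode constants are ForReading = 1, ForWriting = 2 and ForAppending = 8. A minimal append-only helper could look like this – a hedged sketch of mine (AppendLine is my own name) that assumes the same IE setting as above, an already existing file, and a Line value without single quotes:

AppendLine = (FilePath as text, Line as text) =>
    Web.Page("<script>
        var fso=new ActiveXObject('Scripting.FileSystemObject');
        var f=fso.OpenTextFile('" & FilePath & "', 8); // 8 = ForAppending
        f.WriteLine('" & Line & "');
        f.Close();
    </script>")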

The idea of the test: take a small table and try to write it to CSV several times.

‘What if Power Query is so fast that it creates a conflict while writing to the file, e.g. if PQ spawns multiple threads?’, I thought.

And I decided to add a delay between appends.
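Function.InvokeAfter is the natural tool for that in M: it evaluates a function only after a given duration has passed.

// returns 42, but only after waiting about 2 seconds
Function.InvokeAfter( () => 42, #duration(0, 0, 0, 2) )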

We can start with a function:

let
    fScriptFunction = (tbl as table, FilePath as text) =>
        let
            ForAppending = 8, // FileSystemObject IOMode constant
            Json = Text.FromBinary( Json.FromValue( Table.ToRows( Table.DemoteHeaders( tbl ) ) ) ),
            Export = Web.Page("<script>
                var fso=new ActiveXObject('Scripting.FileSystemObject');
                // variant that also creates the file if it does not exist:
                // var objFile=fso.OpenTextFile('" & FilePath & "', " & Text.From(ForAppending) & ", true);
                // variant without creation of a new file:
                var objFile=fso.OpenTextFile('" & FilePath & "', " & Text.From(ForAppending) & ");
                var arr=" & Json & ";
                objFile.WriteLine(arr.join('\n'));
                objFile.WriteBlankLines(1);
                objFile.Close();
            </script>")
        in
            Export,

    // Initial parameters
    ResultingFilePath = "C:/Temp/test.csv",
    buffer = Table.Buffer( MyTable ), // MyTable is another query in the workbook
    Delay = 1, // seconds between appends
    TestVolume = List.Generate(
        () => [ i = 1, Tbl = buffer, dummy = null, time = null ],
        each // do while
            [i] < 10,
        each // just write to CSV on each iteration
            let
                // your complex logic
                // export = fScriptFunction( buffer, ResultingFilePath )
                export = Function.InvokeAfter( () => fScriptFunction( buffer, ResultingFilePath ),
                             #duration(0, 0, 0, Delay) )
            in
                [ i = [i] + 1,
                  Tbl = buffer, // could be a table range / skip first N rows – see the paging sketch below
                  dummy = export, // to trigger the script
                  time = DateTime.LocalNow() ],
        each // result of iteration
            [ [i], [Tbl], [dummy], [time] ] ),
    #"Converted to Table" = Table.FromList( TestVolume, Splitter.SplitByNothing(), null, null,
        ExtraValues.Error ),
    #"Expanded Column1" = Table.ExpandRecordColumn( #"Converted to Table", "Column1",
        {"dummy", "time"}, {"dummy", "time"} ),
    #"Expanded dummy" = Table.ExpandTableColumn( #"Expanded Column1", "dummy",
        {"Caption", "Source", "ClassName", "Id", "Data"},
        {"Caption", "Source", "ClassName", "Id", "Data"} )
in
    #"Expanded dummy"

However, I couldn’t achieve a stable number of rows in the resulting file.

Even if I set Delay to 3 or even 5 seconds, I still get my table written to the file two, three or five times.


I had no time to explore this further, so I share it as is.
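Still, if the append can ever be made reliable, real paging could replace the Tbl = buffer placeholder. An untested sketch of mine (PageSize, PageCount and ExportPages are my own names) that would slot into the same let expression, reusing fScriptFunction:

PageSize = 100000,
PageCount = Number.RoundUp( Table.RowCount( buffer ) / PageSize ),
ExportPages = List.Generate(
    () => [ i = 0, dummy = null ],   // seed state, nothing exported yet
    each [i] <= PageCount,           // one extra step so the last page gets written
    each [ i = [i] + 1,
           // the previous [i] is the zero-based page number;
           // Table.Skip + Table.FirstN tolerate overshoot on the last page
           dummy = fScriptFunction(
               Table.FirstN( Table.Skip( buffer, [i] * PageSize ), PageSize ),
               ResultingFilePath ) ],
    each [dummy] )

One caveat: fScriptFunction demotes headers on every call, so every page would repeat the header row; in a real implementation you would demote headers for the first page only.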

I think that in Excel, iteration could be driven by VBA code that modifies the code of PQ queries…

You can download my sample file from here.

P.S. – not a single corporate laptop was put at risk during the writing of this post.
