PowerShell: Using AlphaFS to list files and folders with paths longer than 260 characters and checking access

PowerShell is great. However, it has a couple of limitations, whether by design or by inheritance, that are annoying to say the least. One commonly documented failing, inherited from the .NET Framework, is its inability to access files whose total path length exceeds 260 characters. Another limitation is the linear way in which commands are executed.
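To illustrate the first point, the .NET Framework path APIs that the file system cmdlets sit on top of reject anything over the old MAX_PATH limit. A minimal sketch (the path here is fabricated just to exceed the limit):

```powershell
# Sketch: the .NET Framework rejects paths over the ~260 character MAX_PATH limit.
$longPath = "C:\" + ("a" * 300)
try {
    [System.IO.Path]::GetFullPath($longPath)
} catch [System.IO.PathTooLongException] {
    "Caught PathTooLongException for a $($longPath.Length)-character path"
}
```

On Windows PowerShell (running on the .NET Framework), any cmdlet that resolves such a path hits the same wall.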

The first issue is a major one, particularly when working with network file systems, roaming profiles or any other area where longer paths exist. Having Mac or Linux users on your network makes paths over 260 characters more likely, as both of those systems support longer path names.

There is a very good library available which can help overcome the 260 character limit. It implements most of the .NET framework functions for accessing files and folders, without the path length limitation. It’s a great addition to any project that accesses files and folders.

I have been working on a project to migrate users who are still using roaming profiles over to folder redirection. Some scripting has been required to automate the process and minimise user interaction, and this is being done in PowerShell. One component of the script involves finding out how many files and folders exist, how big they are, and whether or not we have access to read them.

PowerShell could do this.

Get-ChildItem $path -Recurse -Force

can list all the files and their sizes (the Length property). Piping that list to a

Get-Content -Tail 1 -ErrorAction SilentlyContinue -ErrorVariable ReadErrors | Out-Null

will give you a variable that lists all files that have any errors. All good.
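Put together (with a hypothetical $path holding the profile root), the whole check is a single pipeline. Note that -ErrorVariable takes the variable name without the $ sigil, and a leading + makes it append rather than overwrite:

```powershell
# Sketch: list all files with sizes, collecting read errors along the way.
# -File (PowerShell 3.0+) keeps directories out of the Get-Content step.
$Files = Get-ChildItem $path -Recurse -Force -File
$Files |
    Get-Content -Tail 1 -ErrorAction SilentlyContinue -ErrorVariable +ReadErrors |
    Out-Null
# $Files holds the names and sizes (Length); $ReadErrors lists the unreadable files.
```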

This command is susceptible to the path limit though. It is also slow: each item is processed in order, one at a time. Whilst getting just the end of a file is quick, the whole command still takes time. Running against a 200 MB user profile, it took over 2 minutes to list all files with sizes into a variable and give me a list of files with access denied. With over 2 TB of user profiles to migrate, that was too long.

With this method out of the window, I looked at using some C# code that I could import. The .NET Framework offers a host of ways to process this sort of data, and I ended up with the function below. It uses the AlphaFS library to get details of the files and directories, which removes the path length limitation. As I was using the .NET Framework, I could also use File.Open(), which opens the file without reading it: it still throws an access denied error if the file cannot be read, just more quickly. The whole process could then be combined into a Parallel.ForEach loop, so directories and files are recursed concurrently. The result was a scan of a 200 MB profile in around 10 seconds, a much more acceptable time.
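That read check can be sketched in a few lines (a minimal sketch, assuming $path holds the file to test; the function below achieves the same effect with AlphaFS's File.GetSize instead). Note that System.IO.File.Open is itself still subject to the 260-character limit, whereas AlphaFS provides equivalents without it:

```powershell
# Sketch: test readability by opening a file handle without reading any data.
# $path is a placeholder for the file being checked.
try {
    $stream = [System.IO.File]::Open($path, 'Open', 'Read')
    $stream.Dispose()
    $readable = $true
} catch {
    # Access denied (or any other open failure) lands here
    $readable = $false
}
```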

The code could be used in a C# project, or in the format below it can be included in a PowerShell script. You will need to download the AlphaFS library and put it in an accessible location so that it can be included in your script.

# Start of File Details Definition
$RecursiveTypeDef = @"
using System;
using System.Collections;
using System.Collections.Generic;
using System.Data;
using System.Threading.Tasks;
using System.Diagnostics;
using System.Linq;

public class FileDetails
{
    public List<FileInfo> GetRecursiveFileFolderList(string RootDirectory)
    {
        m_FileFolderList = new List<FileInfo>();
        m_GetFileDetails(RootDirectory);
        return m_FileFolderList;
    }

    private List<FileInfo> m_FileFolderList = new List<FileInfo>();

    private void m_GetFileDetails(string DirectoryName)
    {
        List<string> AllFiles = new List<string>();
        List<string> AllFolders = new List<string>();

        FileInfo FI = new FileInfo();
        FI.FileName = DirectoryName;
        FI.Type = Type.Directory;
        FI.FileSize = 0;
        FI.ReadSuccess = true;
        try {
            AllFiles = Alphaleonis.Win32.Filesystem.Directory.GetFiles(DirectoryName).ToList();
        } catch {
            FI.ReadSuccess = false;
        }
        try {
            AllFolders = Alphaleonis.Win32.Filesystem.Directory.GetDirectories(DirectoryName).ToList();
        } catch {
            FI.ReadSuccess = false;
        }
        lock (m_FileFolderList) {
            m_FileFolderList.Add(FI);
        }

        Parallel.ForEach(AllFiles, File =>
        {
            FileInfo FileFI = new FileInfo();
            FileFI.FileName = File;
            FileFI.Type = Type.File;
            try {
                FileFI.FileSize = Alphaleonis.Win32.Filesystem.File.GetSize(File);
                FileFI.ReadSuccess = true;
            } catch {
                FileFI.ReadSuccess = false;
            }
            lock (m_FileFolderList) {
                m_FileFolderList.Add(FileFI);
            }
        });

        Parallel.ForEach(AllFolders, Folder => { m_GetFileDetails(Folder); });
    }

    public struct FileInfo
    {
        public long FileSize;
        public string FileName;
        public Type Type;
        public bool ReadSuccess;
    }

    public enum Type
    {
        Directory,
        File
    }
}
"@

# Update the following lines to point to your AlphaFS.dll file.
Add-Type -Path $PSScriptRoot\AlphaFS.dll
Add-Type -TypeDefinition $RecursiveTypeDef -ReferencedAssemblies "$PSScriptRoot\AlphaFS.dll", System.Data

# End of File Details Definition

# Use of the function: 
$FileInfo = New-Object FileDetails
$Info = $FileInfo.GetRecursiveFileFolderList("C:\Windows")
$Info | Format-Table -Autosize -Wrap

This will output a full file and directory list of the C:\Windows directory. The property ReadSuccess is true if the file could be opened for reading.
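A couple of examples of working with the returned list, for instance to total the file sizes and pull out anything that could not be read:

```powershell
# Sketch: summarise the results returned by GetRecursiveFileFolderList
$totalBytes = ($Info | Where-Object { $_.Type -eq 'File' } |
    Measure-Object -Property FileSize -Sum).Sum
$denied = $Info | Where-Object { -not $_.ReadSuccess }
"{0:N0} bytes scanned; {1} items could not be read" -f $totalBytes, @($denied).Count
```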

There is plenty of scope to modify this to meet your needs if they are different, but it is an example of how you can bring the power of the .NET Framework into PowerShell to really boost some of your scripts.



7 thoughts on “PowerShell: Using AlphaFS to list files and folders with paths longer than 260 characters and checking access”

  1. Try to use Long Path Tool to list files and folders longer than 260 characters. It is also a powerful tool for shortening URL links and fixing file extension and path issues.

    1. Thanks. That tool is useful for those not looking to script actions. Thanks for the feedback though.

  2. I tried your code, but ran into errors – only difference is the dll’s path.

    Add-Type : blabla\Temp\2\2nxlclqs.0.cs(11) : Using the generic type ‘System.Collections.Generic.List’
    requires 1 type arguments
    blabla\Temp\2\2nxlclqs.0.cs(10) : {
    blabla\Temp\2\2nxlclqs.0.cs(11) : >>> public List GetRecursiveFileFolderList(string RootDirectory)
    blabla\Temp\2\2nxlclqs.0.cs(12) : {
    At blibli\Pur_Alpha_from_Inet.pq1.ps1:83 char:1

    I’m on PS 3 / .Net 4

    I had to change every list definition to use the correct type argument. Like:
    public List<FileInfo> GetRecursiveFileFolderList(string RootDirectory)
    or
    private List<FileInfo> m_FileFolderList = new List<FileInfo>();

    Then, it ran OK.

    Thanks, it helped me a lot.

    1. Thanks for the feedback. I have attempted to reproduce the problem that you have mentioned, but I can’t. Glad you got it to work though.

    2. I too have this exact error and cannot get this to run unfortunately and I wish your comment that you fixed it actually showed a line of code that you fixed as an example but all you listed there was verbatim what’s in the broken code above. :(

      So, what was the fix to make this work? I’m on PS5 Win7x64 dotnet 4.62.

    1. Thanks, I vaguely remember trying the enumerate methods, but I think I ran into an issue with returning some of the data that I wanted. I suppose I could have used an enumerate followed by a Get on the specific files that I wanted, but this seems a duplication of effort. This part is actually a slight subset of the original function that I wrote.

      I fully agree that Enumerate should be a better technical solution though – as long as it provides all of the information that you need.
