Say a directory or folder's path on a website is https://superuser.xyz/images/, but you don't know this right away. Ordinarily, webmasters don't make browsing of sub-folders available: directory listing is disabled on the server, so entering the URL directly in Chrome (whether guessed or taken from an image's path) typically returns an error such as 403 Forbidden or 404 Not Found instead of a list of the folder's files.
Also, even if the directory listing were accessible, or if you right-click on a webpage and choose Inspect or View Page Source, you could find the folder and its contents, but the Inspect panel only lets you save the files one at a time, which is inefficient.
In Google Chrome on Windows 10, how do you download all the contents of an online directory in one batch, rather than one by one?
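For context, here is a minimal sketch of what a batch download involves when a directory listing *is* reachable: fetch the auto-generated index page, pull out the file links, and retrieve each one. This is not a Chrome feature; it's a standalone Python script using only the standard library, and the URL is the hypothetical one from the question above.

```python
import os
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags in a directory index page,
    skipping sort links (?...), absolute paths (/...), and the
    parent-directory link (../)."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and not value.startswith(("?", "/", "../")):
                    self.links.append(value)

def extract_links(index_html):
    parser = LinkExtractor()
    parser.feed(index_html)
    return parser.links

def download_directory(index_url, dest="downloads"):
    # Fetch the index page, then save every file it links to.
    os.makedirs(dest, exist_ok=True)
    with urllib.request.urlopen(index_url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    for name in extract_links(html):
        file_url = urljoin(index_url, name)
        urllib.request.urlretrieve(file_url, os.path.join(dest, os.path.basename(name)))

# Example with the hypothetical URL from the question:
# download_directory("https://superuser.xyz/images/")
```

Tools like GNU Wget (`wget -r -np <url>`) do essentially the same crawl-and-fetch loop with far more robustness; the sketch only illustrates the mechanism, and it works only when the server actually serves a listing rather than a 403/404.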