
Is there any text editor that can edit such a huge (multi-gigabyte) file?

I've tried:

  • gedit
  • kate
  • nano
  • vim
  • mcedit

without success.

muru
cupakob
  • Do you need to edit it or just view? If the latter, you can simply use "less" from CLI. – Daniele Santi Mar 03 '11 at 12:21
  • @MrShunz: yes, I want to edit the file. @Bakhtiyor: the answer is "YES" :) – cupakob Mar 04 '11 at 06:14
  • I recommend editing your question to mention the fact that you want to edit the file. That would make it so people didn't have to search through comments to figure out your question and/or if your question is similar enough to one they have. – Firefeather Mar 09 '11 at 20:13
  • By any chance are you trying to open the 42 zip bomb base file? I had this problem and I found that a program called "010 Editor" worked well –  May 09 '15 at 15:43
  • Related on Stack Overflow: [Working with huge files in linux](https://stackoverflow.com/questions/1591723/working-with-huge-files-in-linux) – Eliah Kagan Jan 11 '17 at 02:28
  • Am faced with the same problem right now. Just made the switch to Linux from Windows, and to my horror (and surprise), nothing seems to be nearly as good at opening/editing gigantic .txt files as the Windows-only EmEditor. So, terrible as it seems, I am now running EmEditor in a vm inside VirtualBox, and it works pretty well. I just opened a 2.5GB txt file, and EmEditor opened it immediately. I was able to do editing pretty fast (not at native speed, but not too annoying). Adding large chunks of text, F&R, etc. – Michael Beijer Apr 08 '20 at 17:34
  • I have found that programmatically manipulating large txt files works much quicker. Not a text editor approach, but worth considering. – demongolem Jul 23 '20 at 17:50
  • As time has moved on, things have become easier. I just edited a 3.4G file with vi and didn't even think about it; it just worked. Maybe took 10 seconds to save the file to an SSD. – James Moore Apr 02 '22 at 17:09

16 Answers

Another method is to use split: split the file into (say) 8 pieces, edit the pieces with an editor, and then reassemble them afterwards.

split -b 53750k <your-file>

cat xa* > <your-file>


SYNOPSIS
       split [OPTION]... [INPUT [PREFIX]]

       -a, --suffix-length=N
              use suffixes of length N (default 2)

       -b, --bytes=SIZE
              put SIZE bytes per output file

       -C, --line-bytes=SIZE
              put at most SIZE bytes of lines per output file

       -d, --numeric-suffixes
              use numeric suffixes instead of alphabetic

       -l, --lines=NUMBER
              put NUMBER lines per output file
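Putting the pieces together, a complete round trip might look like the sketch below (file names and the 500k piece size are illustrative; `cksum` is only there to verify that the reassembled file is byte-identical):

```shell
# Create a sample "big" file standing in for the real multi-GB one.
seq 1 100000 > bigfile.txt
sum_before=$(cksum < bigfile.txt)

# Split into ~500 kB pieces; -d gives numeric suffixes (x00, x01, ...),
# which keeps the reassembly order unambiguous.
split -d -b 500k bigfile.txt

# ... edit the individual pieces (x00, x01, ...) with any editor ...

# Reassemble in order and verify nothing was lost or reordered.
cat x[0-9][0-9] > rejoined.txt
sum_after=$(cksum < rejoined.txt)
[ "$sum_before" = "$sum_after" ] && echo "round trip OK"
```

Beware the caveat from the comments: if your editor silently appends a trailing newline to a piece, the reassembled file will differ from the original.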
Umesh .A Bhat
schneehase
  • it seems to be the best solution at the moment... – cupakob Mar 04 '11 at 14:43
  • Take note that many editors **will add a newline character** to the end of your edited file, and do it *without informing you!* For more info see *How to stop Gedit, Gvim, Vim, Nano from adding End-of-File newline char?* http://askubuntu.com/q/20871/2670 – Peter.O Mar 23 '11 at 06:04
  • nice way ... and then you should use vim with the single parts ... I hate vim :P but it wins over every other editor here – Postadelmaga Dec 18 '12 at 00:12
  • @Peter.O: Did the link change? I'm having a hard time finding info on the newline character issue at that URL. :/ Update: Found the referenced question here: http://askubuntu.com/q/13317/372950 – rinogo Sep 10 '15 at 17:27
  • (In short, use `nano --nonewlines` to avoid the automatic addition of newlines) – rinogo Sep 10 '15 at 17:31
  • This is the only working method, along with the one suggested by @rinogo; nano is honestly the fastest and most reliable editor to work with. Each of the rest is either complicated or bloated or has its own anti-features. – user6039980 Oct 25 '17 at 16:38
  • Thanks! This one helped a lot :) – alex May 20 '19 at 11:55

Try joe. I just used it to edit a ~5G SQL dump file. It took about a minute to open the file and a few minutes to save it, with very little use of swap (on a system with 4G RAM).

sierrasdetandil

For a file that size you will probably not find a practical editor. If you only want to replace some lines in the file, you can look at it with less or grep, and use sed to search and replace them.

like this:

sed -e 's/oldstuff/newstuff/g' inputFileName > outputFileName

There are some useful examples on Wikipedia: http://en.wikipedia.org/wiki/Sed
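Because sed streams the file instead of loading it all into memory, it copes with arbitrarily large inputs. A couple of common sketches (file names and contents are illustrative):

```shell
# Small sample input standing in for a huge file.
printf 'keep\noldstuff\ndrop me\nkeep\n' > input.txt

# Replace text everywhere, writing the result to a new file:
sed -e 's/oldstuff/newstuff/g' input.txt > output.txt

# Delete a range of lines by number (here lines 2-3) without any editor:
sed -e '2,3d' input.txt > trimmed.txt

# GNU sed can also edit in place, keeping a backup with the given suffix:
sed -i.bak -e 's/oldstuff/newstuff/g' input.txt
```

Note that `-i` still writes a full temporary copy behind the scenes, so you need enough free disk space for a second copy of the file.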

schneehase

Give it a go, if you like, but such big files become impractical for "normal" editing; e.g., you don't want to save your edits too often, as that will take too long :)

If it's a one-off, split and join would work quite well; it is simple enough to chop the file up into manageable chunks and then rejoin the pieces... Take note that many editors will add a newline character to the end of your edited file, and do it without informing you! For more info see How to stop Gedit, Gvim, Vim, Nano from adding End-of-File newline char?

Try Gvim if you really want to edit such a big file.... I've just loaded a 3.9GB file into it, and all seems to be normal...

Here is an interesting link on the matter, at stackoverflow

Peter.O
  • doesn't work with gvim.... – cupakob Mar 04 '11 at 14:43
  • @cupakob: It just now successfully loaded a 4.5GB file on my system, using Gvim... It took 6 minutes to load. Did you wait long enough? (This is what I mean about saving the file. It will take a long time)... Try running `iotop` to watch its I/O stats as it is loading.. System Monitor shows I've got 3.2 GB of RAM (which puzzles me, as I have 4 GB)... – Peter.O Mar 05 '11 at 08:25
  • @cupakob: I've tried 8GB this time, and Gvim has successfully loaded it... So Gvim can "technically" handle big, Bigger, and maybe even the "BIGGEST" files, but even so, it is somewhat "impractical" (unless you are like me and prepared to wait 41 minutes to load 8GB.. :) ... but I don't think I'll bother doing it again..... – Peter.O Mar 05 '11 at 09:45
  • You seem to have linked the wrong question. – psusi Apr 22 '14 at 14:54
  • You seem to have changed the first link to be the same as the second, rather than to "how to stop gedit, etc from adding end of file newline". – psusi Apr 24 '14 at 02:51
  • @psusi: fixed and double checked this time :) – Peter.O May 09 '14 at 23:33

There is another very simple and fast way to replace content in very large files (it worked for me instead of editing large MySQL dumps).

First of all, install Midnight Commander, a great file manager for Linux systems:

sudo apt-get install mc

After that you can open a file of any size in view mode (F3 shortcut), switch to hex view (F4 shortcut) and activate edit mode (F2 shortcut).

For example, I had a 3 GB MySQL dump from which I wanted to remove one SQL line. I opened view mode, found the string, switched to hex mode and replaced the content before the needed line with a MySQL comment (the string "-- ", hex codes 2D 2D 20).

(screenshot: mc hex view)
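The same trick, overwriting bytes in place so the file's length never changes, can also be done non-interactively with dd once you know the byte offset. A sketch (the sample file and offset are illustrative):

```shell
# Small sample "dump" standing in for a multi-GB file.
printf 'INSERT INTO t VALUES (1);\nDROP TABLE t;\n' > dump.sql

# Byte offset where the unwanted line starts (line 1 is 26 bytes long).
offset=26

# Overwrite the first three bytes of "DROP" with the SQL comment marker "-- ",
# turning the line into "-- P TABLE t;", which MySQL ignores.
# conv=notrunc is essential: without it dd would truncate the file here.
printf -- '-- ' | dd of=dump.sql bs=1 seek="$offset" conv=notrunc 2>/dev/null
```

Because the write happens in place, this works even on files larger than RAM, but, as with the hex-editor approach, you can only replace bytes, never insert or delete them.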

Fabby
Andrew Zhilin

Use glogg - the fast, smart log explorer: http://glogg.bonnefon.org/

alex

The nedit text editor has been around a long time and is quite capable. It can open a 1.9 GB text file in about 20 seconds. It has a Windows-like graphical interface with all the standard text-editing features you'd expect, like syntax highlighting, indenting, line numbering, and so on.

If you want to resize the window, do that before opening the large file. The X11 Motif interface is a bit slow to resize, but then it's also a taxing request.

It's in all the standard repositories, so install with:

sudo apt-get install nedit

It is GPLv2 open source.

https://sourceforge.net/projects/nedit/

I'm wading through 30-40 MB text files and nedit handles them easily.

Marc Compere

010 Editor is great for me, works very fast.

Frankie Drake

According to tuxdiary:

HTH

Adam

You can open the file using hexedit. However you will only be able to change text, not add or remove it.

ostrokach
  • This is the right answer: You can edit files that are bigger than your virtual memory, because it edits in-place. – Ole Tange Mar 10 '20 at 16:46

We got into a situation where a log file had grown to 6 GB and we needed to search it by date or string. Few well-known text editors could cope with such a big file.

I found that the JOE editor was able to load my 6 GB file in about 2 minutes and let me explore it.

Windows version (sourceforge.net)

Ubuntu (sourceforge.net)

SDsolar
praaveen V R

According to this Wikipedia article, Comparison of text editors, VIM supports large files, among others. I was going to suggest Geany, but there is a "?" in its large-file-support field...

EDIT: I went ahead and tried Geany, and gave up after waiting 10 minutes with 3 cores pegged and basically all my memory (virtual and physical) in use the entire time... Not conclusive, since it might have managed to open the file if I'd been more patient. I also looked for, and did not find, any settings/preferences for handling large files differently.

I like fred.bear's answer best.

sebikul
bumbling fool

On Windows you have TextPad, EditPad, EmEditor and Kainet.

On Linux you have kinesics (http://turtlewar.org/projects/editor/) and many hex editors, such as bless (http://home.gna.org/bless/) or wxHexEditor (http://wxhexeditor.sourceforge.net/home.php).

All of them let you edit very large files (even terabyte-sized), and you can do it easily, without needing to split and recombine the file, which is cumbersome and error-prone.

skan

I used MadEdit in the past; it was the only one that survived opening files larger than my RAM.

https://github.com/madedit/madedit

Kokizzu

Emacs will do the job (I've edited 10+ GiB files in it before), but it is approximately as unfriendly to the new user as vim, so it may not suit your needs. The learning curve is pretty steep.

Darael

I work with NetBeans: it is better than Eclipse in that context.

I know that it is for developers, but you can open any plain text file with it.

TRiG
Abdennour TOUMI