Zero out a file on Solaris
The number of files currently opened by a given process can be examined with the pfiles command. pfiles displays the process's current open-file limit along with information about every file the process currently has open; see the example below. With that said, below are ways of clearing file content from the command line.
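On Solaris, pfiles takes a process ID (the PID below is a placeholder). As a rough cross-check on any system with /proc, you can count the entries under /proc/&lt;pid&gt;/fd; this is a sketch, not a full replacement for pfiles output:

```shell
# Solaris: show the open-file limit and details of every open file
# of a process (1234 is a placeholder PID):
#   pfiles 1234
# On Solaris and Linux alike, /proc exposes the open descriptors as
# entries under /proc/<pid>/fd; counting them gives the number open:
pid=$$                                  # use the current shell as the example process
count=$(ls "/proc/$pid/fd" | wc -l)
echo "process $pid has $count open file descriptors"
```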
Vinod: How can I specify this condition: if the log file size grows past 2GB, then clear the one-month-old data? The problem is that the files are created and deleted automatically, so the file names are dynamic.

Reply: There should be a way, but we have to do a little research about this before we can give you a solution. At the moment, I have only used the redirect methods I mentioned at the start of my post.

James: There are other methods for completely removing file content from the disk, such as the shred command, plus a few others.
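A minimal sketch of one way to approach the 2GB question, assuming a single known log path and a line-count retention window (the path, threshold, and window below are all placeholders scaled down so the sketch is self-contained; real log rotation is better handled by logadm on Solaris or logrotate on Linux):

```shell
LOG=app.log                  # hypothetical log path
MAXSIZE=1000                 # threshold in bytes (use 2147483648 for 2GB)
KEEP=100                     # assumed retention window, in lines

seq 1 500 > "$LOG"           # stand-in data so the sketch runs as-is

size=$(wc -c < "$LOG")
if [ "$size" -gt "$MAXSIZE" ]; then
    # over the threshold: keep only the newest $KEEP lines
    tail -n "$KEEP" "$LOG" > "$LOG.tmp" && mv "$LOG.tmp" "$LOG"
fi
wc -l < "$LOG"               # 100
```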
Mike: Many thanks for the useful tip; it is surely simple as well.

Bruno: True, the basic idea is the use of the redirection mechanism.

If you know of any other methods, add them to the comments.

To replace the contents of the file blah, this command makes sense, as it uses standard UNIX redirection to place the contents of one file (a known empty one) into another file.
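The redirection described above can be sketched as follows; /dev/null serves as the "known empty" source file, and blah is the example filename from the text:

```shell
echo "some log data" > blah      # give the file some content first
cat /dev/null > blah             # /dev/null is always empty, so this truncates blah
wc -c < blah                     # 0
```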
Another way to do this, although less obvious and probably less readable to others in a shell script, is to use the redirection operator on its own (> blah), which truncates the file without running any command at all.

I want to delete 40 lines after every 20 lines. Please let me know how I can do it. Best regards.

Delete first lines from a BIG file.
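The 40-after-20 deletion asked about above can be sketched with awk line arithmetic; the file names here are placeholders, and the input is generated so the sketch is self-contained:

```shell
seq 1 120 > sample.txt                 # 120 numbered lines as stand-in input
# Each repeating block is 60 lines: keep the first 20, drop the next 40.
# (NR - 1) % 60 maps every line to its offset within the current block.
awk '(NR - 1) % 60 < 20' sample.txt > kept.txt
wc -l < kept.txt                       # 40  (lines 1-20 and 61-80 survive)
```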
Hi, I need a Unix command to delete the first n lines from a log file, without using any temporary file. I found that sed -i is a useful command for this, but it is not supported in my environment (AIX 6).
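One sketch that avoids an on-disk temporary file, at the cost of holding the remainder of the file in memory (so it suits moderate files, not multi-gigabyte logs; with a temp file, tail -n +"$((n + 1))" file > tmp && mv tmp file is the robust route):

```shell
seq 1 10 > server.log                  # stand-in log with 10 numbered lines
n=3                                    # number of leading lines to delete (placeholder)
# Capture everything after line n in a variable, then rewrite the
# file in place; no temporary file ever touches the disk.
rest=$(tail -n "+$((n + 1))" server.log)
printf '%s\n' "$rest" > server.log
head -1 server.log                     # 4
```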
The file size is approx. MB. Thanks in advance.

Delete duplicate lines from a huge file, leaving unique lines.
Hi all, I have a very huge file (4GB) which has duplicate lines, and I want to delete the duplicate lines, leaving only unique lines. I have tried sort, uniq, and awk, but I don't know if what I have works. I also want to read each line of the file in a for loop. I have a file which contains huge data.
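A standard awk idiom for removing duplicate lines while preserving order, likely the one the post above is reaching for, is !seen[$0]++ (when order does not matter, sort -u does the same job):

```shell
printf 'apple\nbanana\napple\ncherry\nbanana\n' > fruits.txt
# seen[$0]++ evaluates to 0 (false) the first time a line appears, so
# !seen[$0]++ prints only first occurrences; later duplicates are skipped
awk '!seen[$0]++' fruits.txt           # prints apple, banana, cherry in order
```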
How to delete all lines in a file that have duplicates, i.e. derive only the lines that appear once.
Input: a b b c d d
I need: a c
I know how to get the lines that have duplicates (b d) with sort file | uniq -d, but I need the opposite of this. I have searched the forum and other places as well, and have found solutions for everything except this variant of the problem.

I have a file with 28,00, lines of rows; the first 80 lines will be chunks.
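uniq has a flag for exactly this: -u prints only the lines that occur once in sorted input, the mirror image of -d. Using the sample data from the question:

```shell
printf 'a\nb\nb\nc\nd\nd\n' > letters.txt
sort letters.txt | uniq -d      # lines with duplicates: b, d
sort letters.txt | uniq -u      # lines appearing exactly once: a, c
```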