
The Archive Base


Editorial Team
Asked: May 11, 2026


I have a really big log file (9GB — I know I need to fix that) on my box. I need to split it into chunks so I can upload it to Amazon S3 for backup, since S3 has a maximum file size of 5GB. So I would like to split it into several chunks and then upload each one.

Here is the catch: I only have 5GB free on my server, so I can't just do a simple Unix split. Here is what I want to do:

  1. Grab the first 4GB of the log file and write it out to a separate file (call it segment1).
  2. Upload segment1 to S3.
  3. rm segment1 to free up space.
  4. Grab the middle 4GB of the log file and upload it to S3. Clean up as before.
  5. Grab the remaining 1GB and upload it to S3.

I can’t find the right Unix command to split with an offset. split only produces equal-size chunks, and csplit doesn’t seem to have what I need either. Any recommendations?
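For reference, the "split with an offset" the question asks about is essentially what dd provides: its skip= operand jumps over a given number of input blocks before count= blocks are copied. A scaled-down sketch (file and segment names are illustrative; for real 4GB chunks you would use something like bs=1M skip=4096 count=4096):

```shell
# Toy-scale demo: a 9-byte "log" split into 4+4+1 byte segments,
# mirroring the 9GB -> 4GB + 4GB + 1GB plan from the question.
printf 'ABCDEFGHI' > biglogfile

dd if=biglogfile of=segment1 bs=4 skip=0 count=1 2>/dev/null
# ...upload segment1 to S3, then: rm segment1

dd if=biglogfile of=segment2 bs=4 skip=1 count=1 2>/dev/null
# ...upload segment2, then: rm segment2

dd if=biglogfile of=segment3 bs=4 skip=2 count=1 2>/dev/null
# ...upload the final short remainder

cat segment1 segment2 segment3   # concatenation reproduces the original
```

Because each segment is removed before the next dd runs, no more than one chunk exists on disk at a time, which is what the 5GB free-space constraint requires.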



1 Answer

  1. Editorial Team
     Added an answer on May 11, 2026 at 4:41 pm

    One (convoluted) solution is to compress it first. A textual log file should easily go from 9GB to well below 5GB; then you delete the original, getting your 9GB of space back.

    Then you run that compressed file through split, so the pieces stay small and no extra full-size copy is written. What you’ll end up with is the compressed file and the three pieces for upload.

    Upload them, then delete them, then uncompress the original log.
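    That compress / delete / split / reassemble flow can be sketched at toy scale (sizes are shrunk so it runs anywhere; all file and prefix names are illustrative):

```shell
# Compress, delete the original, split the compressed copy into
# fixed-size pieces, then reassemble and uncompress at the other end.
seq 1 20000 > biglogfile
cp biglogfile original_copy       # kept only to verify the round trip

gzip biglogfile                   # writes biglogfile.gz, removes biglogfile
split -b 2k biglogfile.gz part_   # pieces: part_aa, part_ab, ...

# ...upload each part_ file and delete it, then on the receiving side:
cat part_* > restored.gz
gunzip restored.gz                # yields "restored"
cmp restored original_copy && echo "round trip OK"
```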

    =====

    A better solution is to just count the lines (say 3 million) and use an awk script to extract and send the individual parts:

    awk 'NR <= 1000000' biglogfile > bit1
    # send and delete bit1
    
    awk 'NR > 1000000 && NR <= 2000000' biglogfile > bit2
    # send and delete bit2
    
    awk 'NR > 2000000' biglogfile > bit3
    # send and delete bit3
    

    Then, at the other end, you can either process bit1 through bit3 individually, or recombine them:

    mv bit1 whole
    cat bit2 >>whole ; rm bit2
    cat bit3 >>whole ; rm bit3
    

    And, of course, this splitting can be done with any of the standard text processing tools in Unix: perl, python, awk, head/tail combo. It depends on what you’re comfortable with.
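    For instance, the head/tail combo mentioned above does the same three-way split; here at toy scale, with 10 lines per piece standing in for the 1 million in the awk version (names are illustrative):

```shell
# 30-line stand-in for the log; bit1/bit2/bit3 mirror the awk example.
seq 1 30 > biglogfile

head -n 10 biglogfile > bit1                 # lines 1-10
head -n 20 biglogfile | tail -n 10 > bit2    # lines 11-20
tail -n +21 biglogfile > bit3                # lines 21-30

cat bit1 bit2 bit3 > whole
cmp whole biglogfile && echo "split is lossless"
```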


© 2021 The Archive Base. All Rights Reserved