Documentation for Loggly

Amazon S3 Log Ingestion

Loggly provides the infrastructure to aggregate and normalize log events so they are available to explore interactively, build visualizations, or create threshold-based alerting. In general, any method to send logs from a system or application to an external source can be adapted to send logs to Loggly. The following instructions provide one scenario for sending logs to Loggly.

Loggly can automatically retrieve new log files added to your S3 bucket(s). The Loggly service supports logs from ELB, ALB, and CloudFront, as well as any uncompressed line-separated text files. The Loggly scripts configure all the settings automatically. Alternatively, you can choose to configure your AWS account yourself by following the manual setup instructions.

Loggly works by listening for Amazon events indicating that a new object has been created in your bucket. To make the delivery of these events reliable, they are sent through Amazon’s Simple Queue Service (SQS), which holds each event until it can be retrieved. When Loggly receives a notification, it downloads the log file and ingests it into Loggly.

S3 ingestion has a maximum S3 file size of 1GB. If a file exceeds 1GB, Loggly skips it.

Supported file formats: .txt, .gz, .json.gz, .zip, .log.

Any plain text or zipped file is supported, provided its Content-Type in the S3 file metadata on AWS is text/plain, text/rtf, application/x-gzip, or application/zip.
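
If you upload log files yourself, the Content-Type comes from whatever tool performs the upload. As a minimal illustration (assuming boto3; the bucket and key names are placeholders), this is one way to upload a plain-text log with an explicit Content-Type that matches the list above:

    import boto3

    s3 = boto3.client("s3")

    # Upload a plain-text log file with an explicit Content-Type so it matches
    # one of the supported types (text/plain, text/rtf, application/x-gzip,
    # application/zip). The bucket and key names are placeholders.
    s3.upload_file(
        Filename="app.log",
        Bucket="my-log-bucket",
        Key="loggly/2017/01/app.log",
        ExtraArgs={"ContentType": "text/plain"},
    )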

Overview of Amazon Simple Queue Service (SQS) and Loggly Integration

Amazon Simple Queue Service (SQS) is a fast, reliable, scalable, fully managed message queuing service. SQS makes it simple and cost-effective to decouple the components of a cloud application. You can use SQS to transmit any volume of data, at any level of throughput, without losing messages or requiring other services to be always available.

Whenever a new object is created in your S3 bucket, S3 sends an ObjectCreated event to the SQS queue. Loggly then retrieves that notification from the queue; it contains the key and bucket of the S3 object. Loggly downloads that object from S3 using the access key and secret access key that you provide.

Objects added to the S3 bucket before integration are not sent to the SQS queue.
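
The sketch below illustrates this retrieval flow. It is not Loggly's actual code, just a minimal boto3 approximation; the queue URL is a placeholder.

    import json
    import boto3

    sqs = boto3.client("sqs")
    s3 = boto3.client("s3")

    # Placeholder queue URL; the real queue is created during setup.
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/loggly-s3-queue"

    # Long-poll the queue for ObjectCreated notifications.
    resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20)

    for message in resp.get("Messages", []):
        body = json.loads(message["Body"])
        for record in body.get("Records", []):
            # Each record carries the bucket and key of the new object.
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            s3.download_file(bucket, key, "/tmp/" + key.replace("/", "_"))
        # Remove the notification once the object has been processed.
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])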

To access the S3 source configuration page, select the Source Setup option in the side menu and click S3 Sources. Here you can create new AWS S3 log sources and manage existing ones.

The following information and options are available on this page.

  • Status
    • check: Indicates the configuration is successful.
    • warning: Indicates a problem needing urgent attention.
  • Enable
    Toggle the check box to enable or disable AWS source(s).
  • Path
    Location of the S3 logs.
  • Tags
    Tag(s) related to the S3 bucket.
  • Actions
    Edit or delete the corresponding source entry.

Adding a new AWS source

Adding a new AWS source can be accomplished by using the script option described below. If you prefer doing it manually, click the Manual tab and follow the instructions for the manual setup.

Set up S3 log ingestion

From the S3 Sources configuration page under Source Setup, click Add New.

  1. Under Step 1, enter the name of the S3 bucket from which you would like to send logs. Here you can also choose to provide a prefix. 

    A prefix operates similarly to a folder. If you add a prefix here, only keys (or files) in that folder are ingested by Loggly. The prefix can also contain multiple folders separated by slashes, for example: loggly/2017/01.

    One prefix per bucket is allowed. If you change the prefix, only the keys with the new prefix are ingested.

  2. Under Step 2, enter the AWS account ID for the account containing the bucket. Your account ID is used to set up the queue to send notifications about the new objects in your bucket. You can find your account ID on the AWS Security Credentials page.

    • If you have multiple active tokens, choose the customer token you would like to use to send logs to Loggly by selecting the appropriate token from the drop-down field.

    • If you have only one active token, that token is used as default. Therefore, this step is not presented on the page if you have one active token.

  3. The script requires administrator access to configure your account. Install the AWS CLI tool if you haven’t already, then run the aws configure command found under Step 3 using administrator credentials.

  4. Run the script below (also found under Step 4) to give Loggly read access to your chosen S3 log bucket; the parameters are described below the script example. A rough sketch of what the script configures appears after this list.

    The script sets up an SQS queue so that Loggly receives notifications when new objects are added to your S3 bucket.

    Your S3 bucket and SQS queue must be in the same region.

    curl -O https://www.loggly.com/install/SQS3script.py
    python SQS3script.py --s3bucket <your_bucket_name> --acnumber <your_account_number>
    • s3bucket: The name of the bucket from which you would like to send logs.

    • acnumber: Your AWS account number, which you can get from your AWS account page in the console.

    • user (optional): The IAM username that Loggly should use when accessing the queue and bucket. Please use a dedicated user for Loggly. The default is loggly-s3-user.

    • admin (optional): If you have more than one user set up with the AWS CLI, provide the name of the profile with administrative privileges from the `.aws/credentials` file in your home directory. The default is "default".

    • sqsname (optional): The name of the SQS queue where Loggly receives notifications when objects are added to the bucket. Use a dedicated queue for Loggly. The default is loggly-s3-queue.

  5. Under Step 4.1, enter the SQS Queue name from the script output.

  6. Under Step 4.2, enter the Access key ID and Secret access key from the script output.

  7. You may optionally provide one or more comma-separated tags that describe your data to make it easier to search.

  8. Click Save after you have entered the information. You are returned to the AWS Sources page, and if the configuration was successful, a green checkmark appears for this source.
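
For reference, the following is a rough boto3 sketch of the kind of AWS configuration the script performs. The exact steps of SQS3script.py may differ; the bucket name and prefix are placeholders, and the queue name is the script's default.

    import json
    import boto3

    BUCKET = "my-log-bucket"     # placeholder bucket name
    QUEUE = "loggly-s3-queue"    # the script's default queue name
    PREFIX = "loggly/"           # optional key prefix

    sqs = boto3.client("sqs")
    s3 = boto3.client("s3")

    # 1. Create the SQS queue (it must be in the same region as the bucket).
    queue_url = sqs.create_queue(QueueName=QUEUE)["QueueUrl"]
    queue_arn = sqs.get_queue_attributes(
        QueueUrl=queue_url, AttributeNames=["QueueArn"]
    )["Attributes"]["QueueArn"]

    # 2. Allow S3 to send event notifications to the queue.
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "s3.amazonaws.com"},
            "Action": "sqs:SendMessage",
            "Resource": queue_arn,
            "Condition": {"ArnLike": {"aws:SourceArn": "arn:aws:s3:::" + BUCKET}},
        }],
    }
    sqs.set_queue_attributes(QueueUrl=queue_url, Attributes={"Policy": json.dumps(policy)})

    # 3. Send ObjectCreated events for keys under the prefix to the queue.
    s3.put_bucket_notification_configuration(
        Bucket=BUCKET,
        NotificationConfiguration={
            "QueueConfigurations": [{
                "QueueArn": queue_arn,
                "Events": ["s3:ObjectCreated:*"],
                "Filter": {"Key": {"FilterRules": [{"Name": "prefix", "Value": PREFIX}]}},
            }]
        },
    )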

Troubleshooting Amazon S3 Ingestion

If you don’t see any data show up in the search tab, check for these common problems.

  • Wait a few minutes in case indexing needs to catch up.
  • Try the manual method if the script method doesn’t help.
  • Check if the AWS source is enabled under the AWS Sources tab.
  • Check the log files to make sure they exist and you have the right path.
  • Objects added before the integration are not sent to Loggly, so test by sending new logs only.
  • Check the Account overview page to see if you are exceeding the data volume limit per your subscription.
  • Check for errors on the page and correct them.
  • Search or post your own Amazon S3 Ingestion questions in the community forum.
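
If you want to verify the wiring yourself, one quick check (a sketch assuming boto3; the bucket name is a placeholder) is to confirm the bucket actually has a queue notification configured:

    import boto3

    s3 = boto3.client("s3")

    # List the bucket's queue notification configurations; an empty list means
    # no ObjectCreated events are being sent to SQS.
    config = s3.get_bucket_notification_configuration(Bucket="my-log-bucket")
    for q in config.get("QueueConfigurations", []):
        print(q["QueueArn"], q["Events"], q.get("Filter"))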

Learn how Loggly can help with all your AWS Log Management