3. Example
You can create random logs to test the project by running the commands below:
- Launch Elasticsearch and Kibana with the `docker-compose -f docker-compose.yaml -f docker-compose.elastic.yaml up -d elasticsearch kibana` command
- Load the Elastic Common Schema template by running the `./load_templates.sh` script
- Create the random test data by launching the Python script from the buffalogs_module/examples folder: `python random_example.py`
- Go to Kibana at `localhost:5601` → Stack Management → Index Patterns and create a new index pattern with `cloud-*` as the name, selecting `@timestamp` as the timestamp field
- Check: you should now see a Docs count of 2000 at Stack Management → Index Management for the `cloud-<today_date>` index

Now you can analyze the newly uploaded log data at `localhost:5601`.
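For reference, the sketch below illustrates the kind of documents such a test script could index: ECS-style login events pushed into a `cloud-<date>` index. It assumes the `elasticsearch` Python client, and the field values are illustrative; this is not the actual `random_example.py`.

```python
# Minimal sketch of generating random ECS-style login docs; the user names,
# countries, and agents below are made-up values for illustration only.
import random
from datetime import datetime, timedelta, timezone

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")
index_name = f"cloud-{datetime.now(timezone.utc):%Y-%m-%d}"  # e.g. cloud-2023-08-01

users = ["alice", "bob", "carol"]
countries = [("IT", 41.9, 12.5), ("US", 38.9, -77.0), ("JP", 35.7, 139.7)]
agents = ["Mozilla/5.0 (X11; Linux x86_64)", "Mozilla/5.0 (Windows NT 10.0)"]

now = datetime.now(timezone.utc)
for i in range(2000):
    country, lat, lon = random.choice(countries)
    doc = {
        "@timestamp": (now - timedelta(minutes=i)).isoformat(),
        "user": {"name": random.choice(users)},
        "event": {"category": "authentication", "outcome": "success"},
        "source": {
            "ip": f"203.0.113.{random.randint(1, 254)}",
            "geo": {"country_iso_code": country, "location": {"lat": lat, "lon": lon}},
        },
        "user_agent": {"original": random.choice(agents)},
    }
    es.index(index=index_name, document=doc)  # one document per simulated login
```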
At this stage, you can choose to run the application either manually or automatically (using Celery).
First of all, you have to apply the migrations in order to propagate the changes from your models into the database schema. Run `./manage.py makemigrations` to create new migrations based on any changes made to the models, then apply them with `./manage.py migrate`.
To run the application manually, launch the management command below from buffalogs_module/buffalogs:

```bash
python manage.py impossible_travel
```
This command can also be launched with a specific time range over which the detection will be run:

```bash
python manage.py impossible_travel '2023-08-01 00:00:00' '2023-08-01 3:00:00'
```
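If you prefer to trigger the detection from Python code (for example in a test), Django's standard `call_command` helper accepts the same positional arguments. This is a minimal sketch, not a documented BuffaLogs API; the settings module path is an assumption:

```python
# Sketch: invoking the management command programmatically via Django's
# call_command; the settings module name below is an assumption.
import os

import django
from django.core.management import call_command

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "buffalogs.settings")  # assumed path
django.setup()

# Same positional time-range arguments as the CLI invocation above
call_command("impossible_travel", "2023-08-01 00:00:00", "2023-08-01 3:00:00")
```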
You can also clear all the data saved in the database by running:

```bash
python manage.py clear_models
```
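For context, a clearing command like this is typically a thin Django management command that deletes the rows of the relevant models. The sketch below is hypothetical; the model names and import path are assumptions, not necessarily the project's actual code:

```python
# Hypothetical sketch of a clear_models-style command; the model names
# (Login, Alert) and app path are assumptions for illustration only.
from django.core.management.base import BaseCommand

from impossible_travel.models import Alert, Login  # assumed app/model names


class Command(BaseCommand):
    help = "Delete all detection data from the database"

    def handle(self, *args, **options):
        Alert.objects.all().delete()
        Login.objects.all().delete()
        self.stdout.write(self.style.SUCCESS("All models cleared"))
```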
To run it in an automated way, just start up all the tools with:

```bash
sudo docker-compose up -d
```
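Under the hood, running the detection automatically means scheduling a periodic Celery task. A beat schedule along these lines is how such a setup is usually wired; the broker URL, task path, and 30-minute interval here are assumptions, not the project's actual settings:

```python
# Illustrative Celery beat configuration; the broker URL, task path, and
# interval are assumptions for illustration only.
from celery import Celery
from celery.schedules import crontab

app = Celery("buffalogs", broker="redis://localhost:6379/0")  # assumed broker

app.conf.beat_schedule = {
    "impossible-travel-detection": {
        "task": "impossible_travel.tasks.process_logs",  # assumed task path
        "schedule": crontab(minute="*/30"),  # run every 30 minutes
    },
}
```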
In both cases, the results are available at `localhost:80`. Note that only the details of logins with different user agents or countries are shown there.