At this point, sensors can start sending their readings to the Logstash data pipeline that we created in the previous section. They just need to send the data as follows:
curl -XPOST -u sensor_data:sensor_data --header "Content-Type: application/json" "http://localhost:8080/" -d '{"sensor_id":1,"time":1512102540000,"reading":16.24}'
Since we don't have real sensors, we will simulate the data by sending requests of this type. The simulated data and the script that sends it are included in the code at https://github.com/pranav-shukla/learningelasticstack/tree/master/chapter-10/data.
If you are on Linux or macOS, open the Terminal and change the directory to the learningelasticstack workspace that you checked out from GitHub.
Now, go to the chapter-10/data directory and execute load_sensor_data.sh:
$ pwd
/Users/pranavshukla/workspace/learningelasticstack
$ cd chapter-10/data
$ ls
load_sensor_data.sh sensor_data.json
$ ./load_sensor_data.sh
The load_sensor_data.sh script reads sensor_data.json line by line and submits each line to Logstash using the curl command we just saw.
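The actual script is in the repository, but its behavior can be sketched roughly as follows. The function names, the dry-run structure, and the blank-line handling here are illustrative assumptions, not the script's exact contents:

```shell
#!/bin/sh
# Illustrative sketch of a loader: POST each line of a newline-delimited
# JSON file to the Logstash HTTP input, one document per request.

post_reading() {
  # $1 = one JSON document, e.g. {"sensor_id":1,"time":...,"reading":...}
  curl -s -XPOST -u sensor_data:sensor_data \
       --header "Content-Type: application/json" \
       "http://localhost:8080/" -d "$1"
}

load_sensor_data() {
  # $1 = path to the data file; each non-empty line is one sensor reading.
  while IFS= read -r line; do
    [ -n "$line" ] && post_reading "$line"
  done < "$1"
}

# Usage (assuming the data file from the repository):
# load_sensor_data sensor_data.json
```

Posting one document per request keeps the sketch simple; each line must be a complete JSON object, which is why blank lines are skipped before calling curl.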
We have just replayed one day's worth of sensor readings, taken every minute from different sensors across a few geographical locations, into Logstash. The Logstash data pipeline that we built earlier should have enriched the data and sent it to our Elasticsearch.
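Before moving on, it can be worth confirming that the documents actually reached Elasticsearch. A minimal check might look like the following; the index name used in the example call is an assumption, so substitute whatever index your Logstash output is configured to write to:

```shell
#!/bin/sh
# Print the document count for a given Elasticsearch index, assuming
# Elasticsearch is listening on localhost:9200.
count_docs() {
  curl -s "http://localhost:9200/$1/_count"
}

# Example (illustrative index name):
# count_docs sensor_data
```

A day of per-minute readings should yield 1,440 documents per sensor, so the count gives a quick sanity check that nothing was dropped along the way.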
It is time to switch over to Kibana and get some insights from the data.