I am currently trying to set up my own little SIEM, but cannot seem to transfer the files the way I want.
I am capturing the data with tshark:
tshark -i eth0 -T ek > cap.pcap.json
This outputs a JSON file formatted for Elasticsearch's bulk API.
I have (at least I think so) managed to upload the data to my Elasticsearch using curl.
curl -H "Content-Type: application/json" -XPOST "localhost:9200/_bulk?pretty" --data-binary @cap.pcap.json
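For reference, the _bulk endpoint expects newline-delimited JSON: one action line, then one source line per document, which is exactly the layout tshark -T ek emits (that's why --data-binary, which preserves newlines, is the right curl option). A minimal sketch of that layout; the index name and fields below are made up for illustration:

```shell
# Hypothetical two-document bulk body in the NDJSON layout that
# the _bulk endpoint expects: one action line, then one source line.
printf '%s\n' \
  '{"index":{"_index":"packets-demo"}}' \
  '{"timestamp":"2019-01-01T00:00:00Z","layers":{}}' \
  '{"index":{"_index":"packets-demo"}}' \
  '{"timestamp":"2019-01-01T00:00:01Z","layers":{}}' > bulk-demo.json

# Every document occupies exactly two lines, so the line count is even.
wc -l < bulk-demo.json
```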
This seems to work: I get JSON output on the console that shows me where the data is located in Elasticsearch (_index, _type, _version, _shards, …).
But here I am stuck. I'd like to import the data into Kibana (the full ELK stack is installed), but cannot seem to find the right setting anywhere…
On the Kibana dashboard it is possible to "Add Data to Kibana" for a SIEM, but I don't find a JSON option, "merely" Zeek and Auditbeat (…). Can anyone help me or point me to a site where this is explained? I searched, but only found things like: "once you have it in Elasticsearch, it is in Kibana"…
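The "Add Data" page only covers Beats modules and similar integrations; data that is already in Elasticsearch just needs an index pattern (Management → Index Patterns in Kibana). In recent Kibana versions this can also be scripted through the saved objects API; the pattern title "packets-*" and the time field name below are assumptions about how the tshark data was indexed:

```shell
# Assumes Kibana on localhost:5601; the kbn-xsrf header is required
# for Kibana API writes. Adjust title/timeFieldName to your index.
curl -X POST "localhost:5601/api/saved_objects/index-pattern/packets" \
  -H "kbn-xsrf: true" -H "Content-Type: application/json" \
  -d '{"attributes":{"title":"packets-*","timeFieldName":"timestamp"}}'
```

After that the documents should show up in Discover. Note that the SIEM app itself expects the ECS field layout produced by Beats/Zeek, so plain tshark output will appear in Discover/Visualize rather than in the SIEM views.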
How can I secure Elasticsearch for production use in Docker?
I use this docker-compose file:
version: '2'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:5.6.16
    container_name: elasticsearch
    restart: unless-stopped
    environment:
      - "network.host=0.0.0.0"
      - "http.port=9200"
      - "cluster.name=elasticsearch"
      - "node.name=db-master"
      - "node.master=true"
      - "node.data=true"
      - "bootstrap.memory_lock=true"
      - "ES_JAVA_OPTS=-Xms6g -Xmx6g"
      - xpack.security.enabled=false
    ulimits:
      memlock:
        soft: -1
        hard: -1
    mem_limit: 12g
    volumes:
      - esdata:/usr/share/elasticsearch/data
    ports:
      - 127.0.0.1:9200:9200
    networks:
      - esnet

volumes:
  esdata:
    driver: local

networks:
  esnet:
I want Elasticsearch to be accessible only on the localhost network (only local apps should access it), so it shouldn't be reachable from the internet. I bind to localhost with
- 127.0.0.1:9200:9200, but I don't know whether that is enough.
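As a sanity check (assuming a Linux host with iproute2), you can confirm that the published port is bound to loopback only. Binding as 127.0.0.1:9200:9200 does restrict the host-side mapping to loopback, though other containers attached to the esnet network can still reach the container directly:

```shell
# Should list 127.0.0.1:9200, not 0.0.0.0:9200 or [::]:9200:
ss -ltn 'sport = :9200'

# From any other machine this should fail to connect:
# curl --connect-timeout 3 http://<server-ip>:9200
```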
Could someone kindly direct me to suitable documentation, Wolfram or otherwise, on connecting Mathematica to an Elasticsearch instance? I don't need anything complicated, just enough to write a record of two numbers and a space-separated string, which will need tokenising. An improvement would be if I could write out a list of such records, around 100 MB in size, in one call to Elasticsearch.
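Not Mathematica-specific, but it may help to see the HTTP call such code ultimately has to make: Elasticsearch only needs a bulk POST of newline-delimited JSON, which Mathematica's HTTP functions can produce. A sketch of the payload (the index name "measurements" and the field names are assumptions; tokenising of the string field happens server-side in the analyzer):

```shell
# One record of two numbers and a space-separated string, in the
# two-line-per-document layout the _bulk endpoint expects:
printf '%s\n' \
  '{"index":{"_index":"measurements"}}' \
  '{"x":1.5,"y":2.5,"text":"alpha beta gamma"}' > record.json

# Posting it (requires Elasticsearch running on localhost:9200):
# curl -H "Content-Type: application/json" \
#      -X POST "localhost:9200/_bulk" --data-binary @record.json
```

For the ~100 MB list, you repeat the two-line pair per record in one body; note the default http.max_content_length is 100mb, so a single call of that size sits right at the limit and splitting it into a few batches is safer.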
Dedicated server, 16 GB RAM, 4-core processor, CentOS, Magento 2.2.2, Elasticsearch 6.4.
I am trying to run a reindex from the command line (SSH via PuTTY): php bin/magento indexer:reindex catalogsearch_fulltext
I have Elasticsearch 6.4 installed and set up in the admin, and the connection test shows Successful.
However, when I run the reindex it takes many hours; eventually the SSH terminal disconnects and the whole server, with all sites on it, crashes. The only way I can regain access is by rebooting the server; otherwise it shows an nginx 504 gateway timeout for all sites, and also for WHM and cPanel.
I managed to work through some of the errors previously showing in the Elasticsearch log, but nothing seems to fix this problem, and I've spent three weeks on it.
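One thing worth ruling out, since the crash coincides with the SSH disconnect: a long reindex started in a foreground PuTTY session is killed when the session drops. A detached run (sketch; the log path is arbitrary) at least removes that variable:

```shell
# Detach the reindex from the SSH session so a disconnect cannot kill it,
# and capture the output for later inspection:
nohup php bin/magento indexer:reindex catalogsearch_fulltext \
  > ~/reindex.log 2>&1 &

tail -f ~/reindex.log   # safe to close; the reindex keeps running
```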
I am using Elasticsearch 6.8.1 with a Magento 2.3.2 instance. Although search works fine in the English store, in the Greek store I am having issues with words that have accents. This happens only when Elasticsearch is used; with MySQL everything is fine.
room in Greek is “δωμάτιο”
notice that the letter "ά" carries an accent
If you search:
"δωμ" you get the right results, including names containing "δωμάτιο" (room).
But if you search:
“δωμάτιο” or “δωμάτιο” or “δώμα”
You don't get any results containing this word at all.
The accent on the letter "α" breaks the search.
I tried adding the accented letter as a synonym, but no luck.
Has anyone run into the same issue?
It is very annoying, because it breaks search and makes it frustrating for customers.
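For what it's worth, the stock analyzer indexes "δωμάτιο" and "δωματιο" as different tokens, which is why the synonym approach doesn't help. Elasticsearch's analysis-icu plugin provides an icu_folding token filter that strips accents at both index and search time. A hedged sketch of such an index (the index name, field name, and 6.x-style _doc mapping are assumptions; with Magento you would need to inject the analyzer through its Elasticsearch configuration rather than create the index by hand):

```shell
# Requires: bin/elasticsearch-plugin install analysis-icu
# Index name "my_index" and field "name" are illustrative.
curl -H "Content-Type: application/json" -X PUT "localhost:9200/my_index" -d '
{
  "settings": {
    "analysis": {
      "analyzer": {
        "greek_folded": {
          "tokenizer": "standard",
          "filter": ["lowercase", "icu_folding"]
        }
      }
    }
  },
  "mappings": {
    "_doc": {
      "properties": {
        "name": { "type": "text", "analyzer": "greek_folded" }
      }
    }
  }
}'
```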
In a high-availability environment, how do these technologies replicate Lucene data? How could I replicate my Lucene directories, given that today I do not use such technologies?
I am currently sending log data from NLog to Elasticsearch. When I installed ES, I saw that Elasticsearch has a modules folder containing a lot of x-pack modules. There is one named x-pack-logstash, and I want to understand its purpose.
I am looking for answers to the questions below; unfortunately, I have not found them on the Elasticsearch documentation pages.
What is the purpose and usage of the Logstash module inside ES? Is the Logstash module triggered automatically when NLog logs arrive in ES? Does NLog require Logstash as a mandatory module? I am using the open-source ES version 6.7.0.
P.S. For example, is there a way to disable x-pack-logstash? If yes, what happens then?
Thanks in advance
I am new to Elastic and a question came up: suppose I want to index pages the way Google does. Let's assume it is pure HTML without images, but with a lot of content, e.g. the Wikipedia site.
Do I have to clean the data before putting it into Elastic? If so, is there a tool that helps with that?
If I don't need to clean it and just import the data with some crawler, wouldn't my document inside Elastic end up very large?
Any light helps, thanks…
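On the cleaning question: yes, you normally strip the markup and index only the visible text, or let tooling such as the ingest-attachment plugin or a real crawler (Nutch, StormCrawler, Scrapy) do it for you. A toy sketch of just the cleaning step; the sed-based tag stripping is deliberately naive and would not survive real-world HTML:

```shell
# Strip HTML tags and collapse whitespace before indexing the text.
# A real crawler or the ingest-attachment plugin does this far more
# robustly; this only illustrates the "clean before indexing" idea.
html='<html><body><h1>Wiki</h1><p>Some long article text.</p></body></html>'
text=$(printf '%s' "$html" | sed -e 's/<[^>]*>/ /g' -e 's/  */ /g')
echo "$text"
```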
I want to perform a search operation on Elasticsearch using a RelNode, not the traditional SQL query method.
I tried building a RelNode by providing projections and filters, but it failed, saying no input fields were found.
SchemaPlus postschema = con.getRootSchema().getSubSchema("twitter2");
FrameworkConfig postConfig = Frameworks.newConfigBuilder()
    .defaultSchema(postschema)
    .build();
RelBuilder postBuilder = RelBuilder.create(postConfig);
// Project on a field of this builder's own scan (the original code
// referenced a separate testBuilder and had an unterminated "name literal):
RelNode relSelect = postBuilder.scan("user")
    .project(postBuilder.field("name"))
    .build();
How do I provide projections and a filter in a RelNode for Elasticsearch and get the result set accordingly?
I am trying to install and enable Elasticsearch. I successfully ran the command below:
sudo apt-get install elasticsearch
When I started the Elasticsearch service, it worked. I am stuck when I execute curl localhost:9300; it shows a connection-refused error. Then I tried to install some additional plugins at /usr/share/elasticsearch:
bin/elasticsearch-plugin install pluginName
But it says "elasticsearch-plugin": no such file or directory. I checked, and no such file resides in the elasticsearch/bin folder.
I have Ubuntu 16.04, Magento 2.3.1 and PHP 7.2.
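Two things worth checking here. First, 9300 is the node-to-node transport port; the HTTP API lives on 9200, so the connection-refused curl may be a red herring. Second, Ubuntu's own apt archive ships a very old Elasticsearch whose plugin tool was bin/plugin; elasticsearch-plugin only exists from 5.x onward, which would explain the missing file. A quick check (the plugin paths are the package defaults, and may differ if installed another way):

```shell
dpkg -s elasticsearch | grep -i '^Version'  # old 1.x? add Elastic's own apt repo
curl localhost:9200                         # HTTP port, not 9300
ls /usr/share/elasticsearch/bin/            # plugin (old) vs elasticsearch-plugin (5.x+)
```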