Secure elasticsearch in Docker

How can I secure elasticsearch for production use in Docker?

I use this docker-compose.yml:

    version: '2'
    services:
      elasticsearch:
        image: docker.elastic.co/elasticsearch/elasticsearch:5.6.16
        container_name: elasticsearch
        restart: unless-stopped
        environment:
          - "network.host=0.0.0.0"
          - "http.port=9200"
          - "cluster.name=elasticsearch"
          - "node.name=db-master"
          - "node.master=true"
          - "node.data=true"
          - "bootstrap.memory_lock=true"
          - "ES_JAVA_OPTS=-Xms6g -Xmx6g"
          - xpack.security.enabled=false
        ulimits:
          memlock:
            soft: -1
            hard: -1
        mem_limit: 12g
        volumes:
          - esdata:/usr/share/elasticsearch/data
        ports:
          - 127.0.0.1:9200:9200
        networks:
          - esnet
    volumes:
      esdata:
        driver: local
    networks:
      esnet:

I want Elasticsearch to be accessible only on the localhost network (only local apps should access it), so it shouldn't be reachable from the internet. I bind to localhost with 127.0.0.1:9200:9200, but I don't know whether that is enough.
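A quick way to sanity-check that binding from the host is to compare a loopback connection with a connection to the machine's external address. A minimal Python sketch, where the public IP is a placeholder you would replace with your server's own:

    # Sanity check for the 127.0.0.1:9200:9200 port mapping.
    # "203.0.113.10" is a placeholder public IP, not taken from the setup above.
    import socket

    def can_connect(host, port=9200, timeout=3):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print("loopback:", can_connect("127.0.0.1"))     # expected: True
    print("external:", can_connect("203.0.113.10"))  # expected: False if only loopback is bound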

Writing to ElasticSearch

Could someone kindly direct me to suitable documentation, Wolfram or otherwise, on connecting Mathematica to an ElasticSearch instance? I don't need anything complicated, just enough to write a record of two numbers and a space-separated string, which will need tokenising. As an improvement, I'd like to write out a list of such records, around 100 MB in size, in one call to ElasticSearch.
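For reference, the HTTP contract any client has to reproduce (in Mathematica, for example via URLExecute) is a single POST of newline-delimited JSON to the _bulk endpoint: one action line followed by one document line per record. A minimal Python sketch of that payload, where the index name "records" and the field names are illustrative placeholders:

    # Sketch of an Elasticsearch _bulk request: NDJSON with an action line
    # followed by a document line per record. "records" is a placeholder index;
    # the "_type" field applies to Elasticsearch 6.x and earlier.
    import json, urllib.request

    rows = [(1, 2.5, "a space separated string"),
            (3, 4.5, "another string to tokenise")]

    lines = []
    for a, b, text in rows:
        lines.append(json.dumps({"index": {"_index": "records", "_type": "doc"}}))
        lines.append(json.dumps({"a": a, "b": b, "text": text}))
    body = "\n".join(lines) + "\n"  # _bulk requires a trailing newline

    req = urllib.request.Request(
        "http://localhost:9200/_bulk",
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/x-ndjson"})
    print(urllib.request.urlopen(req).read().decode())

For a list around 100 MB, you would split it into several such requests rather than literally one call.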

TIA

Magento 2.2.2 catalogsearch_fulltext Reindex + Elasticsearch

Dedicated server: 16 GB RAM, 4-core processor, CentOS, Magento 2.2.2, Elasticsearch 6.4.

I am trying to run a reindex from the command line (SSH via PuTTY):

    php bin/magento indexer:reindex catalogsearch_fulltext

I have Elasticsearch 6.4 installed and set up in the admin, and the connection test shows successful.

However, when I run the reindex it takes many hours, and eventually the SSH terminal disconnects and the whole server crashes, along with all the sites on it. The only way I can regain access is by rebooting the server; otherwise every site shows an nginx 504 Gateway Timeout, including WHM and cPanel.

I managed to work through some of the errors previously showing in the Elasticsearch log, but nothing seems to fix this problem, and I've spent three weeks on it.

Magento 2.3.2 Elasticsearch issue with multiple languages (Greek)

I am using Elasticsearch 6.8.1 with a Magento 2.3.2 instance. Although search works fine in the English store, in the Greek store I am having issues with words that have accents. This happens only when Elasticsearch is used; with MySQL everything is fine.

Example:

room in Greek is “δωμάτιο”

notice that the letter “ά” has an accent

If you search:

“δωμ”, you get the right results, including names containing “δωμάτιο” (room).

But if you search:

“δωμάτιο” or “δωμάτιο” or “δώμα”

You don’t get any results containing this word at all.

The accent on the letter “α” breaks the search.

I tried adding the accented letter as a synonym, but no luck.

Is anyone else having the same issue?

It is very annoying because it breaks search and makes it a poor experience for customers.
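For anyone trying to reproduce this outside Magento, one common approach is to fold the accents away at analysis time, for example with a "mapping" char_filter that strips the Greek accents before tokenisation. A standalone Python sketch, where the index and analyzer names are placeholders and not Magento's own configuration:

    # Standalone sketch of accent folding for Greek via a "mapping" char_filter.
    # Index/analyzer names are placeholders; this is not Magento's own config.
    from elasticsearch import Elasticsearch

    es = Elasticsearch(["http://localhost:9200"])
    es.indices.create(index="greek_test", body={
        "settings": {"analysis": {
            "char_filter": {
                "greek_accent_fold": {
                    "type": "mapping",
                    "mappings": ["ά=>α", "έ=>ε", "ή=>η", "ί=>ι",
                                 "ό=>ο", "ύ=>υ", "ώ=>ω"]
                }
            },
            "analyzer": {
                "greek_folded": {
                    "type": "custom",
                    "char_filter": ["greek_accent_fold"],
                    "tokenizer": "standard",
                    "filter": ["lowercase"]
                }
            }
        }}
    })

    # Both "δωμάτιο" and "δωματιο" should now analyze to the same token.
    print(es.indices.analyze(index="greek_test",
                             body={"analyzer": "greek_folded", "text": "δωμάτιο"}))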

What is the purpose of x-pack-logstash inside Elasticsearch?

I am currently sending log data from NLog to Elasticsearch. When I installed ES, I saw that it has a modules folder containing a lot of X-Pack modules. One of them is named x-pack-logstash, and I want to understand the aim of this module.

I am searching for an answer to the questions below; unfortunately, I have not found one on the Elasticsearch documentation pages.

What is the purpose and usage of the Logstash module inside ES? Is the Logstash module triggered automatically when incoming NLog logs arrive in ES? Does NLog require Logstash as a mandatory module? I am using the open-source ES, version 6.7.0.

P.S. For example, is there a way to disable x-pack-logstash? If yes, what happens then?

Thanks in advance

Importing large documents into Elasticsearch? [pending]

I am new to Elastic and a question came up: what if I want to index pages the way Google does? Let's suppose it is pure HTML without images… but with a lot of content, e.g. the Wikipedia site.

Do I have to process the data before putting it into Elastic? If so, is there a tool that helps with that?

And if I don't need to process it and just import the data with some crawler… wouldn't my document inside Elastic end up very large?

Any light on this would help, thanks…
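For the simplest version of the pipeline being asked about, the "processing" can be as small as stripping the HTML down to plain text before indexing, so the stored document holds only the searchable content. A rough Python sketch, where the URL and index name are illustrative placeholders:

    # Rough sketch: fetch a page, strip the HTML down to text, index only the text.
    # URL and index name are illustrative placeholders.
    import json, re, urllib.request

    url = "https://en.wikipedia.org/wiki/Elasticsearch"
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")

    # Crude tag stripping; a real crawler would use a proper HTML parser.
    text = re.sub(r"<[^>]+>", " ", html)
    text = re.sub(r"\s+", " ", text).strip()

    doc = json.dumps({"url": url, "content": text}).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:9200/pages/_doc",
        data=doc, headers={"Content-Type": "application/json"})
    print(urllib.request.urlopen(req).read().decode())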

Apache Calcite: perform a search operation on Elasticsearch using RelNode

I want to perform a search operation on Elasticsearch using a RelNode, not the traditional SQL query method.

I tried building the RelNode by providing projections and filters, but it didn't work, failing with "no input fields found".

    SchemaPlus postschema = con.getRootSchema().getSubSchema("twitter2");
    FrameworkConfig postConfig = Frameworks.newConfigBuilder()
            .defaultSchema(postschema)
            .build();
    RelBuilder postBuilder = RelBuilder.create(postConfig);
    RelNode relSelect = postBuilder.scan("user")
            .project(postBuilder.scan("user").field("name"))
            .build();

How do I provide projections and filters in a RelNode for Elasticsearch and get the result set accordingly?

Issues with Elasticsearch installation on Magento 2.3.1

I am trying to install and enable Elasticsearch. The command sudo apt-get install elasticsearch completed successfully, and when I started the Elasticsearch service it worked. I got stuck when I executed curl localhost:9300, which shows a connection refused error. Then I tried to install some additional plugins from /usr/share/elasticsearch with bin/elasticsearch-plugin install pluginName.

But it says "elasticsearch-plugin": no such file or directory. I have checked, and no such file resides in the elasticsearch/bin folder.

I have Ubuntu 16.04, Magento 2.3.1, and PHP 7.2.

invalid literal for int() with base 10 ElasticSearch

I have a problem with elasticsearch.

I cannot write string values from a pandas DataFrame to Elasticsearch. I'm getting this error:

ValueError: invalid literal for int() with base 10: 'Marc34' 

Elasticsearch index:

    request_body = {
        "settings": {
            "number_of_shards": 6,
            "number_of_replicas": 1
        },
        'mappings': {
            'doc': {
                'properties': {
                    'name_and_id': {'index': 'not_analyzed', 'type': 'string'},
                    'surname_and_id': {'index': 'not_analyzed', 'type': 'string'},
                    'match': {'index': 'not_analyzed', 'type': 'integer'}
                }
            }
        }
    }
    print("creating 'info_main' index...")
    res = es.index(index="info_main", body=request_body, doc_type='info_m')

How I am saving to ES:

    def chunk_bulk(d):
        bulk_data = []
        for index, row in d.iterrows():
            data_dict = {}
            for i in range(len(row)):
                data_dict[d.columns[i]] = int(row[i])
            op_dict = {
                "index": {
                    "_index": 'info_main',
                    "_type": 'info_m',
                }
            }
            bulk_data.append(op_dict)
            bulk_data.append(data_dict)
        res = es.bulk(index='info_main', body=bulk_data)
        return res

    sum_ = 0
    i = 0
    while i <= len(data):
        j = i + 9000
        d = data[i:j]
        i = j
        sum_ = sum_ + len(d)
        chunk_bulk(d)
    print(sum_)

Does anyone know what can cause the problem? If I use only integers and change the mappings to integer, everything works great.
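Presumably the int() cast applied to every value is what a string like 'Marc34' cannot survive, since int('Marc34') raises exactly this ValueError. For comparison, a sketch of a per-column conversion that keeps strings as strings; the sample DataFrame is illustrative only, not the actual data:

    # Sketch: convert per column instead of casting everything to int,
    # so string values like 'Marc34' are kept as strings.
    # The sample DataFrame below is illustrative only.
    import pandas as pd

    def row_to_doc(row):
        doc = {}
        for col, value in row.items():
            if pd.api.types.is_number(value):
                doc[col] = value       # keep numbers as numbers
            else:
                doc[col] = str(value)  # keep 'Marc34' as a string
        return doc

    df = pd.DataFrame({"name_and_id": ["Marc34"], "match": [1]})
    print([row_to_doc(row) for _, row in df.iterrows()])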

Any help is welcomed. Thanks in advance.