Logstash not working with SG - no permissions for indices:data/write/bulk

Hi - does anyone know what could be going wrong here? I assigned my logstash user the sg_all_access role, but in my Logstash logs I am getting: no permissions for indices:data/write/bulk
Here is my sg_all_access config, which is working fine for my kibana user:

sg_all_access:
  cluster:
    - CLUSTER_ALL
  indices:
    '*':
      '*':
        - ALL

Here is my logstash config:

output {
  elasticsearch {
    user => "logstash"
    password => "xxxxxxxxxxxx"
    ssl => true
    ssl_certificate_verification => true
    truststore => "/etc/elasticsearch/truststore.jks"
    truststore_password => "xxxxxxxxxxx"
    hosts => ["mydatanode-a.com:9200", "mydatanode-c.com:9200", "mydatanode-d.com:9200", "mydatanode-e.com:9200"]
    index => "qa-logs-%{+YYYY.MM.dd}"
  }
}

And here is the error I am getting:

{:timestamp=>"2016-11-10T06:40:46.961000+0000",
 :message=>"[403] {\"error\":{\"root_cause\":[{\"type\":\"security_exception\",\"reason\":\"no permissions for indices:data/write/bulk\"}],\"type\":\"security_exception\",\"reason\":\"no permissions for indices:data/write/bulk\"},\"status\":403}",
 :class=>"Elasticsearch::Transport::Transport::Errors::Forbidden",
 :backtrace=>[
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.17/lib/elasticsearch/transport/transport/base.rb:201:in `__raise_transport_error'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.17/lib/elasticsearch/transport/transport/base.rb:312:in `perform_request'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.17/lib/elasticsearch/transport/transport/http/manticore.rb:67:in `perform_request'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.17/lib/elasticsearch/transport/client.rb:128:in `perform_request'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.17/lib/elasticsearch/api/actions/bulk.rb:88:in `bulk'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:53:in `non_threadsafe_bulk'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'",
  "org/jruby/ext/thread/Mutex.java:149:in `synchronize'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:172:in `safe_bulk'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:101:in `submit'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:86:in `retrying_submit'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:29:in `multi_receive'",
  "org/jruby/RubyArray.java:1653:in `each_slice'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:28:in `multi_receive'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/output_delegator.rb:130:in `worker_multi_receive'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/output_delegator.rb:114:in `multi_receive'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:301:in `output_batch'",
  "org/jruby/RubyHash.java:1342:in `each'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:301:in `output_batch'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:232:in `worker_loop'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:201:in `start_workers'"],
 :level=>:warn}

Any ideas? - Thank you!

What's your Search Guard version?


On 10.11.2016 at 14:45, Ross Heilman <rheilman@archinfra.mygbiz.com> wrote:


--
You received this message because you are subscribed to the Google Groups "Search Guard" group.
To unsubscribe from this group and stop receiving emails from it, send an email to search-guard+unsubscribe@googlegroups.com.
To post to this group, send email to search-guard@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/search-guard/eb9311ae-c746-43c9-9fa5-b9022a506455%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Yes, thank you.

Here is the version we installed:

/usr/share/elasticsearch/bin/plugin install -b com.floragunn/search-guard-2/2.3.3.7

Here is another error I am seeing when we changed the permissions to:

sg_logstash:
  cluster:
    - indices:admin/template/get
    - indices:admin/template/put
  indices:
    '*':
      '*':
        - indices:data/write/bulk
        - indices:data/write/bulk[s]
        - indices:data/write/delete
        - indices:data/write/update
        - indices:data/read/search
        - indices:data/read/scroll
        - CREATE_INDEX
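As an aside on how permission lists like this behave: the requested action name (e.g. indices:data/write/bulk) is matched against the configured permission patterns, wildcards included. A toy Python sketch of that idea using fnmatch - a rough illustration, not Search Guard's actual implementation, and the pattern list here is a simplified stand-in:

```python
from fnmatch import fnmatch

# Simplified stand-in for the permission patterns in a role like sg_logstash.
perms = ["indices:data/write/*", "indices:data/read/search"]

def allowed(action, perms):
    # An action is permitted if any configured pattern matches it.
    return any(fnmatch(action, p) for p in perms)

print(allowed("indices:data/write/bulk", perms))     # True
print(allowed("indices:admin/template/put", perms))  # False
```

Note that the role above lists the shard-level variant indices:data/write/bulk[s] separately from indices:data/write/bulk, which suggests both the top-level and shard-level actions need to be covered for a bulk request to succeed.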

We are now seeing this error, and the odd thing is the user=>nil password=>nil at the bottom of the error - like it is not getting passed. Any thoughts?

{:timestamp=>"2016-11-10T19:06:58.506000+0000",
 :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"https://es-m-qa-a.sapphirepri.com:9200/\"]', but an error occurred and it failed! Are you sure you can reach elasticsearch from this machine using the configuration provided?",
 :error_message=>"[403] {\"error\":{\"root_cause\":[{\"type\":\"security_exception\",\"reason\":\"no permissions for indices:data/write/bulk\"}],\"type\":\"security_exception\",\"reason\":\"no permissions for indices:data/write/bulk\"},\"status\":403}",
 :error_class=>"Elasticsearch::Transport::Transport::Errors::Forbidden",
 :backtrace=>[
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.17/lib/elasticsearch/transport/transport/base.rb:201:in `__raise_transport_error'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.17/lib/elasticsearch/transport/transport/base.rb:312:in `perform_request'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.17/lib/elasticsearch/transport/transport/http/manticore.rb:67:in `perform_request'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.17/lib/elasticsearch/transport/client.rb:128:in `perform_request'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.17/lib/elasticsearch/api/actions/bulk.rb:88:in `bulk'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:53:in `non_threadsafe_bulk'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'",
  "org/jruby/ext/thread/Mutex.java:149:in `synchronize'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:172:in `safe_bulk'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:101:in `submit'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:86:in `retrying_submit'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:29:in `multi_receive'",
  "org/jruby/RubyArray.java:1653:in `each_slice'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:28:in `multi_receive'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/output_delegator.rb:130:in `worker_multi_receive'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/output_delegator.rb:114:in `multi_receive'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:301:in `output_batch'",
  "org/jruby/RubyHash.java:1342:in `each'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:301:in `output_batch'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:232:in `worker_loop'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:201:in `start_workers'"],
 :client_config=>{:hosts=>["https://es-m-qa-a.sapphirepri.com:9200/"],
  :ssl=>{:enabled=>true, :truststore_password=>"63GpQWZ8wVvq", :truststore=>"/etc/elasticsearch/truststore.jks"},
  :transport_options=>{:socket_timeout=>0, :request_timeout=>0, :proxy=>nil,
   :ssl=>{:enabled=>true, :truststore_password=>"63GpQWZ8wVvq", :truststore=>"/etc/elasticsearch/truststore.jks"}},
  :transport_class=>Elasticsearch::Transport::Transport::HTTP::Manticore,
  :headers=>{"Authorization"=>"Basic bG9nc3Rhc2g6c21keTN6NTluNWhr"},
  :logger=>nil, :tracer=>nil, :reload_connections=>false, :retry_on_failure=>false,
  :reload_on_failure=>false, :randomize_hosts=>false,
  :http=>{:scheme=>"https", :user=>nil, :password=>nil, :port=>9200}},
 :level=>:error}
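On the user=>nil / password=>nil point: the credentials appear to be sent anyway, baked into the Authorization header that shows up earlier in the same client config dump. A quick Python check, decoding the Basic token from the log above, shows the logstash user is on the wire:

```python
import base64

# The Authorization header from the error log above.
header = "Basic bG9nc3Rhc2g6c21keTN6NTluNWhr"

# Basic auth encodes "user:password" in base64; split off the scheme and decode.
user_pass = base64.b64decode(header.split(" ", 1)[1])
user = user_pass.split(b":", 1)[0]
print(user.decode())  # -> logstash
```

The password decodes alongside the username, so authentication is being attempted; the 403 then looks like a permission problem rather than missing credentials.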

On Thursday, November 10, 2016 at 10:19:47 AM UTC-5, SG wrote:

What's your Search Guard version?


So we narrowed the issue down to one specific config. This is the index name we use in our Logstash config (and we need it to be this): index => "qa-logs-%{loggerName}-%{+YYYY.MM.dd}"
The problem is Search-Guard gives us the permission errors when that is our index name. If we change it to something like index => "qa-logs-%{+YYYY.MM.dd}", Logstash with Search-Guard works fine.
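For context on what that index option does: Logstash expands it with its event sprintf format - %{+pattern} is filled from the event timestamp, %{field} from an event field, and a missing field is left in place as the literal %{field} text. A rough Python sketch of that behavior (not Logstash's actual code; only the date pattern used in this thread is mapped):

```python
import re
from datetime import datetime, timezone

def sprintf(template, event, now=None):
    """Rough sketch of Logstash's event sprintf: %{+pattern} formats the
    timestamp, %{field} substitutes an event field, and a missing field
    is left as the literal %{field} text."""
    now = now or datetime.now(timezone.utc)
    # Assumption: only the Joda-style pattern from this thread is handled.
    joda_to_strftime = {"YYYY.MM.dd": "%Y.%m.%d"}

    def repl(match):
        key = match.group(1)
        if key.startswith("+"):
            return now.strftime(joda_to_strftime[key[1:]])
        return str(event.get(key, match.group(0)))  # missing -> literal

    return re.sub(r"%\{([^}]+)\}", repl, template)

ts = datetime(2016, 11, 10, tzinfo=timezone.utc)
print(sprintf("qa-logs-%{loggerName}-%{+YYYY.MM.dd}", {"loggerName": "app1"}, ts))
# -> qa-logs-app1-2016.11.10
print(sprintf("qa-logs-%{loggerName}-%{+YYYY.MM.dd}", {}, ts))
# -> qa-logs-%{loggerName}-2016.11.10
```

The second case is the failure mode to watch for: if loggerName is missing from an event, the literal %{loggerName} ends up in the index name that reaches the cluster.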

Really stuck now - any help would be greatly appreciated!

Thanks!

-Ross


Hi Ross,

thanks for reporting this, we’re trying to reproduce it. Could you please help us by sending the Elasticsearch logfiles on DEBUG level when the error occurs?

Add this line to the logging.yml file in the config directory:

com.floragunn: DEBUG

Output will be verbose, but helps us to pinpoint the problem. If possible, just send the complete logfile from startup to the point where the “no perm match” error occurs.

Thanks!

Jochen

CTO

floragunn GmbH


On Thursday, 10 November 2016 at 22:14:10 UTC+2, Ross Heilman wrote:

Problem is Search-Guard gives us the permission errors when that is our index name. If we change it to something like index => "qa-logs-%{+YYYY.MM.dd}", Logstash with Search-Guard works fine.


So we narrowed down the issue to one specific config - This is the index name we use in our Logstash config (and we need it to be this): index => "qa-logs-%{loggerName}-%{+YYYY.MM.dd}"

Also, are you sure that this part

%{loggerName}

gets expanded correctly? Because if it does and the outcome is a valid index name, and the logstash user is in group all_access, and this group has all permissions for all indices, there’s no reason it should not work.

Anyways, we need to have a look at the ES debug logs to see what’s going on.

On Thursday, 10 November 2016 at 22:34:42 UTC+2, Jochen Kressin wrote:

So we narrowed the issue down to one specific setting. It is the index name we use in our Logstash config (and we need it to be this): index => "qa-logs-%{loggerName}-%{+YYYY.MM.dd}"

Thank you! Yes, that is the weird thing: the index name seems to get expanded correctly when SG is not enabled, but not when it is enabled. I will post another error that shows %{type} not getting expanded. I am not sure Search Guard has anything to do with it, but it is a very odd issue. I had to roll back the change for the moment. I need to rebuild it in a test environment and reproduce the problem; once I do that, I will send you the debug logs.
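As a workaround idea (a sketch only, assuming events are supposed to carry a loggerName field; the "unknown" default is made up here): a fallback in the filter stage would guarantee the sprintf reference in the output index name always expands, so the literal %{loggerName} never reaches Elasticsearch.

```
filter {
  # If an event arrives without a loggerName field, give it a default so
  # "qa-logs-%{loggerName}-%{+YYYY.MM.dd}" always expands to a real name.
  if ![loggerName] {
    mutate {
      add_field => { "loggerName" => "unknown" }
    }
  }
}
```

Since Elasticsearch index names must be lowercase, a mutate { lowercase => ["loggerName"] } might also be worth adding if the field can contain mixed case.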

{:timestamp=>"2016-11-10T15:20:57.470000+0000", :message=>"[400] {\"error\":{\"root_cause\":[{\"type\":\"pattern_syntax_exception\",\"reason\":\"Illegal repetition near index 8\nqa-logs-%{type}_2016\\.11\\.08.\n ^\"}],\"type\":\"pattern_syntax_exception\",\"reason\":\"Illegal repetition near index 8\nqa-logs-%{type}_2016\\.11\\.08.\n ^\"},\"status\":400}",
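For what it's worth (not confirmed against the Search Guard source, so this is an assumption): the "Illegal repetition" wording above is the message java.util.regex produces, which suggests the unexpanded literal %{type} in the index name is at some point being compiled as a Java regular expression, where {type} looks like a malformed repetition quantifier. A minimal sketch reproducing that exception:

```java
import java.util.regex.Pattern;
import java.util.regex.PatternSyntaxException;

public class IllegalRepetitionDemo {
    public static void main(String[] args) {
        // An unexpanded Logstash sprintf reference left in an index name.
        String indexName = "qa-logs-%{type}_2016\\.11\\.08";
        try {
            // "{type}" is parsed as a repetition quantifier and rejected.
            Pattern.compile(indexName);
            System.out.println("compiled (unexpected)");
        } catch (PatternSyntaxException e) {
            // Message reads "Illegal repetition ...", as in the 400 above.
            System.out.println(e.getMessage());
        }
    }
}
```

If that is what is happening, it would explain why the failure only shows up when the sprintf reference fails to expand: a fully expanded name like qa-logs-somelogger-2016.11.10 contains no `{`, so it compiles fine.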

Thank you, and I'll get you more information as soon as I can,

  • Ross

On Fri, Nov 11, 2016 at 3:58 AM, Jochen Kressin jkressin@floragunn.com wrote:

Also, are you sure that this part

%{loggerName}

gets expanded correctly? Because if it does, and the outcome is a valid index name, and the logstash user is in the sg_all_access group, and this group has all permissions for all indices, there's no reason it should not work.

Anyways, we need to have a look at the ES debug logs to see what’s going on.

Am Donnerstag, 10. November 2016 22:34:42 UTC+2 schrieb Jochen Kressin:

Hi Ross,

thanks for reporting this, we’re trying to reproduce it. Could you please help us by sending the Elasticsearch logfiles on DEBUG level when the error occurs?

Add this line to the logging.yml file in the config directory:

com.floragunn: DEBUG

Output will be verbose, but helps us to pinpoint the problem. If possible, just send the complete logfile from startup to the point where the “no perm match” error occurs.

Thanks!

Jochen

CTO

floragunn GmbH

Am Donnerstag, 10. November 2016 22:14:10 UTC+2 schrieb Ross Heilman:

Problem is, Search Guard gives us the permission errors when that is our index name. If we change it to something like index => "qa-logs-%{+YYYY.MM.dd}", Logstash with Search Guard works fine.

Really stuck now - any help would be greatly appreciated!

Thanks!

-Ross

