Re: Log statement seems to be not working

OK, the output of the command is pasted below (sorry, but it is really long):

root@LEDE:~# nft --debug all add rule ip ipv4_filter incoming tcp dport {ssh} log accept
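
For reference, the {ssh} element in the anonymous set above gets resolved
through the services database before anything is sent to the kernel, so on a
box with a normal /etc/services it is simply TCP port 22; that value is what
shows up later in the set element and in the netlink payload. A quick way to
check the mapping (assuming a standard /etc/services is present):

  grep -w '^ssh' /etc/services    # -> ssh 22/tcp
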
Entering state 0
Reducing stack by rule 1 (line 725):
-> $$ = nterm input (: )
Stack now 0
Entering state 1
Reading a token: --accepting rule at line 269 ("add")
Next token is token "add" (: )
Shifting token "add" (: )
Entering state 17
Reading a token: --accepting rule at line 616 (" ")
--accepting rule at line 246 ("rule")
Next token is token "rule" (: )
Shifting token "rule" (: )
Entering state 10
Reading a token: --accepting rule at line 616 (" ")
--accepting rule at line 380 ("ip")
Next token is token "ip" (: )
Shifting token "ip" (: )
Entering state 32
Reducing stack by rule 237 (line 1771):
   $1 = token "ip" (: )
-> $$ = nterm family_spec_explicit (: )
Stack now 0 1 17 10
Entering state 44
Reducing stack by rule 236 (line 1768):
   $1 = nterm family_spec_explicit (: )
-> $$ = nterm family_spec (: )
Stack now 0 1 17 10
Entering state 43
Reading a token: --accepting rule at line 616 (" ")
--accepting rule at line 587 ("ipv4_filter")
Next token is token "string" (: )
Shifting token "string" (: )
Entering state 50
Reducing stack by rule 230 (line 1744):
   $1 = token "string" (: )
-> $$ = nterm identifier (: )
Stack now 0 1 17 10 43
Entering state 238
Reducing stack by rule 243 (line 1779):
   $1 = nterm family_spec (: )
   $2 = nterm identifier (: )
-> $$ = nterm table_spec (: )
Stack now 0 1 17 10
Entering state 45
Reading a token: --accepting rule at line 616 (" ")
--accepting rule at line 587 ("incoming")
Next token is token "string" (: )
Shifting token "string" (: )
Entering state 50
Reducing stack by rule 230 (line 1744):
   $1 = token "string" (: )
-> $$ = nterm identifier (: )
Stack now 0 1 17 10 45
Entering state 239
Reducing stack by rule 244 (line 1787):
   $1 = nterm table_spec (: )
   $2 = nterm identifier (: )
-> $$ = nterm chain_spec (: )
Stack now 0 1 17 10
Entering state 46
Reading a token: --accepting rule at line 616 (" ")
--accepting rule at line 441 ("tcp")
Next token is token "tcp" (: )
Reducing stack by rule 254 (line 1860):
   $1 = nterm chain_spec (: )
-> $$ = nterm rule_position (: )
Stack now 0 1 17 10
Entering state 54
Next token is token "tcp" (: )
Shifting token "tcp" (: )
Entering state 143
Reading a token: --accepting rule at line 616 (" ")
--accepting rule at line 439 ("dport")
Next token is token "dport" (: )
Shifting token "dport" (: )
Entering state 487
Reducing stack by rule 784 (line 3771):
   $1 = token "dport" (: )
-> $$ = nterm tcp_hdr_field (: )
Stack now 0 1 17 10 54 143
Entering state 493
Reducing stack by rule 780 (line 3755):
   $1 = token "tcp" (: )
   $2 = nterm tcp_hdr_field (: )
-> $$ = nterm tcp_hdr_expr (: )
Stack now 0 1 17 10 54
Entering state 206
Reducing stack by rule 695 (line 3574):
   $1 = nterm tcp_hdr_expr (: )
-> $$ = nterm payload_expr (: )
Stack now 0 1 17 10 54
Entering state 312
Reading a token: --accepting rule at line 616 (" ")
--accepting rule at line 208 ("{")
Next token is token '{' (: )
Reducing stack by rule 451 (line 2716):
   $1 = nterm payload_expr (: )
-> $$ = nterm primary_expr (: )
Stack now 0 1 17 10 54
Entering state 296
Reducing stack by rule 472 (line 2770):
   $1 = nterm primary_expr (: )
-> $$ = nterm shift_expr (: )
Stack now 0 1 17 10 54
Entering state 297
Next token is token '{' (: )
Reducing stack by rule 475 (line 2781):
   $1 = nterm shift_expr (: )
-> $$ = nterm and_expr (: )
Stack now 0 1 17 10 54
Entering state 298
Next token is token '{' (: )
Reducing stack by rule 477 (line 2788):
   $1 = nterm and_expr (: )
-> $$ = nterm exclusive_or_expr (: )
Stack now 0 1 17 10 54
Entering state 299
Next token is token '{' (: )
Reducing stack by rule 479 (line 2795):
   $1 = nterm exclusive_or_expr (: )
-> $$ = nterm inclusive_or_expr (: )
Stack now 0 1 17 10 54
Entering state 300
Next token is token '{' (: )
Reducing stack by rule 481 (line 2802):
   $1 = nterm inclusive_or_expr (: )
-> $$ = nterm basic_expr (: )
Stack now 0 1 17 10 54
Entering state 301
Reducing stack by rule 482 (line 2805):
   $1 = nterm basic_expr (: )
-> $$ = nterm concat_expr (: )
Stack now 0 1 17 10 54
Entering state 302
Next token is token '{' (: )
Reducing stack by rule 491 (line 2859):
   $1 = nterm concat_expr (: )
-> $$ = nterm expr (: )
Stack now 0 1 17 10 54
Entering state 304
Next token is token '{' (: )
Shifting token '{' (: )
Entering state 266
Reading a token: --accepting rule at line 587 ("ssh")
Next token is token "string" (: )
Reducing stack by rule 6 (line 749):
-> $$ = nterm opt_newline (: )
Stack now 0 1 17 10 54 304 266
Entering state 627
Next token is token "string" (: )
Shifting token "string" (: )
Entering state 128
Reducing stack by rule 231 (line 1747):
   $1 = token "string" (: )
-> $$ = nterm string (: )
Stack now 0 1 17 10 54 304 266 627
Entering state 180
Reducing stack by rule 446 (line 2687):
   $1 = nterm string (: )
-> $$ = nterm symbol_expr (: )
Stack now 0 1 17 10 54 304 266 627
Entering state 744
Reducing stack by rule 566 (line 3192):
   $1 = nterm symbol_expr (: )
-> $$ = nterm primary_rhs_expr (: )
Stack now 0 1 17 10 54 304 266 627
Entering state 761
Reducing stack by rule 539 (line 3111):
   $1 = nterm primary_rhs_expr (: )
-> $$ = nterm shift_rhs_expr (: )
Stack now 0 1 17 10 54 304 266 627
Entering state 753
Reading a token: --accepting rule at line 209 ("}")
Next token is token '}' (: )
Reducing stack by rule 542 (line 3122):
   $1 = nterm shift_rhs_expr (: )
-> $$ = nterm and_rhs_expr (: )
Stack now 0 1 17 10 54 304 266 627
Entering state 754
Next token is token '}' (: )
Reducing stack by rule 544 (line 3129):
   $1 = nterm and_rhs_expr (: )
-> $$ = nterm exclusive_or_rhs_expr (: )
Stack now 0 1 17 10 54 304 266 627
Entering state 755
Next token is token '}' (: )
Reducing stack by rule 546 (line 3136):
   $1 = nterm exclusive_or_rhs_expr (: )
-> $$ = nterm inclusive_or_rhs_expr (: )
Stack now 0 1 17 10 54 304 266 627
Entering state 756
Next token is token '}' (: )
Reducing stack by rule 548 (line 3143):
   $1 = nterm inclusive_or_rhs_expr (: )
-> $$ = nterm basic_rhs_expr (: )
Stack now 0 1 17 10 54 304 266 627
Entering state 867
Next token is token '}' (: )
Reducing stack by rule 549 (line 3146):
   $1 = nterm basic_rhs_expr (: )
-> $$ = nterm concat_rhs_expr (: )
Stack now 0 1 17 10 54 304 266 627
Entering state 868
Next token is token '}' (: )
Reducing stack by rule 511 (line 2939):
   $1 = nterm concat_rhs_expr (: )
-> $$ = nterm set_lhs_expr (: )
Stack now 0 1 17 10 54 304 266 627
Entering state 866
Reducing stack by rule 506 (line 2916):
   $1 = nterm set_lhs_expr (: )
-> $$ = nterm set_elem_expr_alloc (: )
Stack now 0 1 17 10 54 304 266 627
Entering state 865
Next token is token '}' (: )
Reducing stack by rule 504 (line 2912):
   $1 = nterm set_elem_expr_alloc (: )
-> $$ = nterm set_elem_expr (: )
Stack now 0 1 17 10 54 304 266 627
Entering state 864
Next token is token '}' (: )
Reducing stack by rule 6 (line 749):
-> $$ = nterm opt_newline (: )
Stack now 0 1 17 10 54 304 266 627 864
Entering state 1000
Reducing stack by rule 499 (line 2888):
   $1 = nterm opt_newline (: )
   $2 = nterm set_elem_expr (: )
   $3 = nterm opt_newline (: )
-> $$ = nterm set_list_member_expr (: )
Stack now 0 1 17 10 54 304 266
Entering state 629
Reducing stack by rule 495 (line 2871):
   $1 = nterm set_list_member_expr (: )
-> $$ = nterm set_list_expr (: )
Stack now 0 1 17 10 54 304 266
Entering state 628
Next token is token '}' (: )
Shifting token '}' (: )
Entering state 870
Reducing stack by rule 494 (line 2864):
   $1 = token '{' (: )
   $2 = nterm set_list_expr (: )
   $3 = token '}' (: )
-> $$ = nterm set_expr (: )
Stack now 0 1 17 10 54 304
Entering state 750
Reducing stack by rule 538 (line 3108):
   $1 = nterm set_expr (: )
-> $$ = nterm rhs_expr (: )
Stack now 0 1 17 10 54 304
Entering state 752
Reducing stack by rule 530 (line 3074):
   $1 = nterm expr (: )
   $2 = nterm rhs_expr (: )
-> $$ = nterm relational_expr (: )
Stack now 0 1 17 10 54
Entering state 306
Reducing stack by rule 443 (line 2663):
   $1 = nterm relational_expr (: )
-> $$ = nterm match_stmt (: )
Stack now 0 1 17 10 54
Entering state 295
Reducing stack by rule 266 (line 1937):
   $1 = nterm match_stmt (: )
-> $$ = nterm stmt (: )
Stack now 0 1 17 10 54
Entering state 270
Reducing stack by rule 263 (line 1923):
   $1 = nterm stmt (: )
-> $$ = nterm stmt_list (: )
Stack now 0 1 17 10 54
Entering state 269
Reading a token: --accepting rule at line 616 (" ")
--accepting rule at line 315 ("log")
Next token is token "log" (: )
Shifting token "log" (: )
Entering state 254
Reducing stack by rule 302 (line 2041):
   $1 = token "log" (: )
-> $$ = nterm log_stmt_alloc (: )
Stack now 0 1 17 10 54 269
Entering state 276
Reading a token: --accepting rule at line 616 (" ")
--accepting rule at line 258 ("accept")
Next token is token "accept" (: )
Reducing stack by rule 300 (line 2037):
   $1 = nterm log_stmt_alloc (: )
-> $$ = nterm log_stmt (: )
Stack now 0 1 17 10 54 269
Entering state 275
Reducing stack by rule 271 (line 1942):
   $1 = nterm log_stmt (: )
-> $$ = nterm stmt (: )
Stack now 0 1 17 10 54 269
Entering state 632
Reducing stack by rule 264 (line 1929):
   $1 = nterm stmt_list (: )
   $2 = nterm stmt (: )
-> $$ = nterm stmt_list (: )
Stack now 0 1 17 10 54
Entering state 269
Next token is token "accept" (: )
Shifting token "accept" (: )
Entering state 243
Reducing stack by rule 587 (line 3283):
   $1 = token "accept" (: )
-> $$ = nterm verdict_expr (: )
Stack now 0 1 17 10 54 269
Entering state 307
Reducing stack by rule 283 (line 1956):
   $1 = nterm verdict_expr (: )
-> $$ = nterm verdict_stmt (: )
Stack now 0 1 17 10 54 269
Entering state 271
Reducing stack by rule 265 (line 1936):
   $1 = nterm verdict_stmt (: )
-> $$ = nterm stmt (: )
Stack now 0 1 17 10 54 269
Entering state 632
Reducing stack by rule 264 (line 1929):
   $1 = nterm stmt_list (: )
   $2 = nterm stmt (: )
-> $$ = nterm stmt_list (: )
Stack now 0 1 17 10 54
Entering state 269
Reading a token: --accepting rule at line 596 ("
")
Next token is token "newline" (: )
Reducing stack by rule 262 (line 1911):
   $1 = nterm stmt_list (: )
-> $$ = nterm rule_alloc (: )
Stack now 0 1 17 10 54
Entering state 268
Next token is token "newline" (: )
Reducing stack by rule 260 (line 1901):
   $1 = nterm rule_alloc (: )
-> $$ = nterm rule (: )
Stack now 0 1 17 10 54
Entering state 317
Reducing stack by rule 32 (line 856):
   $1 = token "rule" (: )
   $2 = nterm rule_position (: )
   $3 = nterm rule (: )
-> $$ = nterm add_cmd (: )
Stack now 0 1 17
Entering state 61
Reducing stack by rule 15 (line 818):
   $1 = token "add" (: )
   $2 = nterm add_cmd (: )
-> $$ = nterm base_cmd (: )
Stack now 0 1
Entering state 41
Next token is token "newline" (: )
Shifting token "newline" (: )
Entering state 4
Reducing stack by rule 3 (line 744):
   $1 = token "newline" (: )
-> $$ = nterm stmt_separator (: )
Stack now 0 1 41
Entering state 237
Reducing stack by rule 12 (line 784):
   $1 = nterm base_cmd (: )
   $2 = nterm stmt_separator (: )
-> $$ = nterm line (: )
Stack now 0 1
Entering state 40
Reducing stack by rule 2 (line 726):
   $1 = nterm input (: )
   $2 = nterm line (: )
Evaluate add
add rule ip ipv4_filter incoming tcp dport {ssh} log accept
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^


update network layer protocol context:
 link layer          : none
 network layer       : ip <-
 transport layer     : none

Evaluate expression
add rule ip ipv4_filter incoming tcp dport {ssh} log accept
                                 ^^^^^^^^^^^^^^^
tcp dport { $ssh }

Evaluate relational
add rule ip ipv4_filter incoming tcp dport {ssh} log accept
                                 ^^^^^^^^^^^^^^^
tcp dport { $ssh }

Evaluate payload
add rule ip ipv4_filter incoming tcp dport {ssh} log accept
                                 ^^^^^^^^^
tcp dport

Evaluate expression
add rule ip ipv4_filter incoming tcp dport {ssh} log accept
                                 ^^^^^^^^^
meta l4proto tcp

Evaluate relational
add rule ip ipv4_filter incoming tcp dport {ssh} log accept
                                 ^^^^^^^^^
meta l4proto tcp

Evaluate meta
add rule ip ipv4_filter incoming tcp dport {ssh} log accept
                                 ^^^^^^^^^
meta l4proto

Evaluate value
add rule ip ipv4_filter incoming tcp dport {ssh} log accept
                                 ^^^^^^^^^
tcp

update transport layer protocol context:
 link layer          : none
 network layer       : ip
 transport layer     : tcp <-

update transport layer protocol context:
 link layer          : none
 network layer       : ip
 transport layer     : tcp <-

Evaluate set
add rule ip ipv4_filter incoming tcp dport {ssh} log accept
                                           ^^^^^
{ $ssh }

Evaluate set element
add rule ip ipv4_filter incoming tcp dport {ssh} log accept
                                            ^^^
$ssh

Evaluate symbol
add rule ip ipv4_filter incoming tcp dport {ssh} log accept
                                            ^^^
$ssh

Evaluate value
add rule ip ipv4_filter incoming tcp dport {ssh} log accept
                                            ^^^
ssh

Evaluate log
add rule ip ipv4_filter incoming tcp dport {ssh} log accept
                                                 ^^^
log

Evaluate verdict
add rule ip ipv4_filter incoming tcp dport {ssh} log accept
                                                     ^^^^^^
accept

Evaluate verdict
add rule ip ipv4_filter incoming tcp dport {ssh} log accept
                                                     ^^^^^^
accept

-> $$ = nterm input (: )
Stack now 0
Entering state 1
Reading a token: --(end of buffer or a NUL)
--EOF (start condition 0)
Now at end of input.
Shifting token "end of file" (: )
Entering state 2
Stack now 0 1 2
Cleanup: popping token "end of file" (: )
Cleanup: popping nterm input (: )
__set%d ipv4_filter 3 size 1
__set%d ipv4_filter 0
    element 00001600  : 0 [end]
ip ipv4_filter incoming
  [ meta load l4proto => reg 1 ]
  [ cmp eq reg 1 0x00000006 ]
  [ payload load 2b @ transport header + 2 => reg 1 ]
  [ lookup reg 1 set __set%d ]
  [ log ]
  [ immediate reg 0 accept ]
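
To make the generated bytecode above easier to follow (my annotation, not
part of the nft output): the 0x00000006 in the cmp is the IP protocol number
for TCP, the 2 bytes loaded at transport header offset 2 are the TCP
destination port field, and the lookup matches that port against the
anonymous set, whose single element appears as the bytes "00 16 00 00"
(port 22, i.e. ssh) in the netlink dump below. Both constants can be
double-checked on a typical Linux system:

  grep -w '^tcp' /etc/protocols   # -> tcp 6 TCP    (the 0x00000006 in the cmp)
  echo $((0x0016))                # -> 22           (the "00 16" port bytes, i.e. ssh)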

----------------    ------------------
|  0000000020  |    | message length |
| 00016 | R--- |    |  type | flags  |
|  0000000003  |    | sequence number|
|  0000000000  |    |     port ID    |
----------------    ------------------
| 00 00 0a 00  |    |  extra header  |
----------------    ------------------
----------------    ------------------
|  0000000104  |    | message length |
| 02569 | R--- |    |  type | flags  |
|  0000000004  |    | sequence number|
|  0000000000  |    |     port ID    |
----------------    ------------------
| 02 00 00 00  |    |  extra header  |
|00016|--|00001|    |len |flags| type|
| 69 70 76 34  |    |      data      |     i p v 4
| 5f 66 69 6c  |    |      data      |     _ f i l
| 74 65 72 00  |    |      data      |     t e r 
|00012|--|00002|    |len |flags| type|
| 5f 5f 73 65  |    |      data      |     _ _ s e
| 74 25 64 00  |    |      data      |     t % d 
|00008|--|00003|    |len |flags| type|
| 00 00 00 03  |    |      data      |           
|00008|--|00004|    |len |flags| type|
| 00 00 00 0d  |    |      data      |           
|00008|--|00005|    |len |flags| type|
| 00 00 00 02  |    |      data      |           
|00008|--|00010|    |len |flags| type|
| 00 00 00 01  |    |      data      |           
|00012|N-|00009|    |len |flags| type|
|00008|--|00001|    |len |flags| type|
| 00 00 00 01  |    |      data      |           
|00010|--|00013|    |len |flags| type|
| 00 04 02 00  |    |      data      |           
| 00 00 00 00  |    |      data      |           
----------------    ------------------
----------------    ------------------
|  0000000076  |    | message length |
| 02572 | R--- |    |  type | flags  |
|  0000000004  |    | sequence number|
|  0000000000  |    |     port ID    |
----------------    ------------------
| 02 00 00 00  |    |  extra header  |
|00012|--|00002|    |len |flags| type|
| 5f 5f 73 65  |    |      data      |     _ _ s e
| 74 25 64 00  |    |      data      |     t % d 
|00008|--|00004|    |len |flags| type|
| 00 00 00 01  |    |      data      |           
|00016|--|00001|    |len |flags| type|
| 69 70 76 34  |    |      data      |     i p v 4
| 5f 66 69 6c  |    |      data      |     _ f i l
| 74 65 72 00  |    |      data      |     t e r 
|00020|N-|00003|    |len |flags| type|
|00016|N-|00001|    |len |flags| type|
|00012|N-|00001|    |len |flags| type|
|00006|--|00001|    |len |flags| type|
| 00 16 00 00  |    |      data      |           
----------------    ------------------
----------------    ------------------
|  0000000300  |    | message length |
| 02566 | R--- |    |  type | flags  |
|  0000000005  |    | sequence number|
|  0000000000  |    |     port ID    |
----------------    ------------------
| 02 00 00 00  |    |  extra header  |
|00016|--|00001|    |len |flags| type|
| 69 70 76 34  |    |      data      |     i p v 4
| 5f 66 69 6c  |    |      data      |     _ f i l
| 74 65 72 00  |    |      data      |     t e r 
|00013|--|00002|    |len |flags| type|
| 69 6e 63 6f  |    |      data      |     i n c o
| 6d 69 6e 67  |    |      data      |     m i n g
| 00 00 00 00  |    |      data      |           
|00248|N-|00004|    |len |flags| type|
|00036|N-|00001|    |len |flags| type|
|00009|--|00001|    |len |flags| type|
| 6d 65 74 61  |    |      data      |     m e t a
| 00 00 00 00  |    |      data      |           
|00020|N-|00002|    |len |flags| type|
|00008|--|00002|    |len |flags| type|
| 00 00 00 10  |    |      data      |           
|00008|--|00001|    |len |flags| type|
| 00 00 00 01  |    |      data      |           
|00044|N-|00001|    |len |flags| type|
|00008|--|00001|    |len |flags| type|
| 63 6d 70 00  |    |      data      |     c m p 
|00032|N-|00002|    |len |flags| type|
|00008|--|00001|    |len |flags| type|
| 00 00 00 01  |    |      data      |           
|00008|--|00002|    |len |flags| type|
| 00 00 00 00  |    |      data      |           
|00012|N-|00003|    |len |flags| type|
|00005|--|00001|    |len |flags| type|
| 06 00 00 00  |    |      data      |           
|00052|N-|00001|    |len |flags| type|
|00012|--|00001|    |len |flags| type|
| 70 61 79 6c  |    |      data      |     p a y l
| 6f 61 64 00  |    |      data      |     o a d 
|00036|N-|00002|    |len |flags| type|
|00008|--|00001|    |len |flags| type|
| 00 00 00 01  |    |      data      |           
|00008|--|00002|    |len |flags| type|
| 00 00 00 02  |    |      data      |           
|00008|--|00003|    |len |flags| type|
| 00 00 00 02  |    |      data      |           
|00008|--|00004|    |len |flags| type|
| 00 00 00 02  |    |      data      |           
|00048|N-|00001|    |len |flags| type|
|00011|--|00001|    |len |flags| type|
| 6c 6f 6f 6b  |    |      data      |     l o o k
| 75 70 00 00  |    |      data      |     u p   
|00032|N-|00002|    |len |flags| type|
|00008|--|00002|    |len |flags| type|
| 00 00 00 01  |    |      data      |           
|00012|--|00001|    |len |flags| type|
| 5f 5f 73 65  |    |      data      |     _ _ s e
| 74 25 64 00  |    |      data      |     t % d 
|00008|--|00004|    |len |flags| type|
| 00 00 00 01  |    |      data      |           
|00016|N-|00001|    |len |flags| type|
|00008|--|00001|    |len |flags| type|
| 6c 6f 67 00  |    |      data      |     l o g 
|00004|N-|00002|    |len |flags| type|
|00048|N-|00001|    |len |flags| type|
|00014|--|00001|    |len |flags| type|
| 69 6d 6d 65  |    |      data      |     i m m e
| 64 69 61 74  |    |      data      |     d i a t
| 65 00 00 00  |    |      data      |     e     
|00028|N-|00002|    |len |flags| type|
|00008|--|00001|    |len |flags| type|
| 00 00 00 00  |    |      data      |           
|00016|N-|00002|    |len |flags| type|
|00012|N-|00002|    |len |flags| type|
|00008|--|00001|    |len |flags| type|
| 00 00 00 01  |    |      data      |           
----------------    ------------------
----------------    ------------------
|  0000000020  |    | message length |
| 00017 | R--- |    |  type | flags  |
|  0000000006  |    | sequence number|
|  0000000000  |    |     port ID    |
----------------    ------------------
| 00 00 0a 00  |    |  extra header  |
----------------    ------------------
Error: Could not process rule: No such file or directory
add rule ip ipv4_filter incoming tcp dport {ssh} log accept
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^



