Merge branch 'dev' into network-vars

dev
Sandeep Singh 2021-11-28 04:42:08 +05:30 committed by GitHub
commit 670f0d6775
158 changed files with 7692 additions and 2092 deletions

DESIGN.md Normal file

@ -0,0 +1,603 @@
# Nuclei Architecture Document
A brief overview of Nuclei Engine architecture. This document will be kept updated as the engine progresses.
## pkg/templates
### Template
A Template is the basic unit of input to the engine: it describes the requests to make, the matching to perform, the data to extract, and so on.
The template structure is defined in this package, along with template-level attributes and convenience methods to validate, parse and compile templates, creating executers.
Any attributes required by the template, engine or requests to function are also set here.
Workflows are also compiled here: their templates are loaded and compiled, and any validations on the provided paths are performed as well.
The `Parse` function is the main entry point: it returns a compiled template for a given `filePath` and `executerOptions`. It compiles all the requests of the template, all the workflows, and any self-contained requests, and also caches templates in an in-memory cache.
### Preprocessors
Preprocessors are also applied here and operate at the template level: they receive the raw template data and can alter it at runtime. The engine uses this mechanism for random string generation.
Custom preprocessors can be used as long as they satisfy the following interface.
```go
type Preprocessor interface {
Process(data []byte) []byte
}
```
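As a minimal, self-contained sketch (the `{{timestamp}}` marker and the type name are made up for illustration, not part of Nuclei), a custom preprocessor that rewrites the raw template bytes before parsing could look like this:
```go
package main

import (
	"bytes"
	"fmt"
	"strconv"
	"time"
)

// timestampPreprocessor is a hypothetical preprocessor: it replaces every
// occurrence of the made-up marker {{timestamp}} in the raw template bytes
// with the current unix time before the template is parsed.
type timestampPreprocessor struct{}

// Process satisfies the Preprocessor interface shown above.
func (t *timestampPreprocessor) Process(data []byte) []byte {
	now := strconv.FormatInt(time.Now().Unix(), 10)
	return bytes.ReplaceAll(data, []byte("{{timestamp}}"), []byte(now))
}

func main() {
	raw := []byte("id: example\n# generated-at: {{timestamp}}\n")
	fmt.Printf("%s", (&timestampPreprocessor{}).Process(raw))
}
```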
## pkg/model
The model package implements the information structures for Nuclei templates. `Info` contains all the major metadata for a template, and the `Classification` structure can be used to provide additional context for vulnerability data.
It also specifies a `WorkflowLoader` interface that is used during workflow loading in the template compilation stage.
```go
type WorkflowLoader interface {
GetTemplatePathsByTags(tags []string) []string
GetTemplatePaths(templatesList []string, noValidate bool) []string
}
```
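As a hedged sketch, a trivial implementation of this interface (e.g. for tests) could serve fixed in-memory data; the real loader lives in `v2/pkg/parsers/workflow_loader.go` and resolves paths through the catalog and filters.
```go
package workflowloader // hypothetical package for the sketch

// staticWorkflowLoader satisfies the WorkflowLoader interface above with
// fixed in-memory data; the real loader resolves template paths via the
// catalog, tag filters and validation.
type staticWorkflowLoader struct {
	byTag map[string][]string
}

// GetTemplatePathsByTags returns the template paths registered for the tags.
func (s *staticWorkflowLoader) GetTemplatePathsByTags(tags []string) []string {
	var paths []string
	for _, tag := range tags {
		paths = append(paths, s.byTag[tag]...)
	}
	return paths
}

// GetTemplatePaths returns the provided paths unchanged; a real implementation
// would resolve and optionally validate them (noValidate skips validation).
func (s *staticWorkflowLoader) GetTemplatePaths(templatesList []string, noValidate bool) []string {
	return templatesList
}
```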
## pkg/protocols
The protocols package implements all the request protocols supported by Nuclei. As of now, this includes http, dns, network, headless and file requests.
### Request
It exposes a `Request` interface that is implemented by all the request protocols supported.
```go
// Request is an interface implemented by any protocol-based request generator.
type Request interface {
Compile(options *ExecuterOptions) error
Requests() int
GetID() string
Match(data map[string]interface{}, matcher *matchers.Matcher) (bool, []string)
Extract(data map[string]interface{}, matcher *extractors.Extractor) map[string]struct{}
ExecuteWithResults(input string, dynamicValues, previous output.InternalEvent, callback OutputEventCallback) error
MakeResultEventItem(wrapped *output.InternalWrappedEvent) *output.ResultEvent
MakeResultEvent(wrapped *output.InternalWrappedEvent) []*output.ResultEvent
GetCompiledOperators() []*operators.Operators
}
```
Many of these methods are similar across protocols while some are very protocol specific.
A brief overview of the methods is provided below -
- **Compile** - Compiles the request with provided options.
- **Requests** - Returns the total number of requests the template will perform.
- **GetID** - Returns any ID for the request.
- **Match** - Used to perform matching for patterns using matchers.
- **Extract** - Used to perform extraction for patterns using extractors.
- **ExecuteWithResults** - Request execution function for input.
- **MakeResultEventItem** - Creates a single result event for the intermediate `InternalWrappedEvent` output structure.
- **MakeResultEvent** - Returns a slice of results based on an `InternalWrappedEvent` internal output event.
- **GetCompiledOperators** - Returns the compiled operators for the request.
The `MakeDefaultResultEvent` function can be used as a default for `MakeResultEvent` when no protocol-specific features need to be implemented for result generation.
For reference protocol request implementations, one can look at the packages below -
1. [pkg/protocols/http](./v2/pkg/protocols/http)
2. [pkg/protocols/dns](./v2/pkg/protocols/dns)
3. [pkg/protocols/network](./v2/pkg/protocols/network)
### Executer
All these request implementations are wrapped in an `Executer`, another interface defined in `pkg/protocols`, which is used during the final execution of the template.
```go
// Executer is an interface implemented by any protocol-based request executer.
type Executer interface {
Compile() error
Requests() int
Execute(input string) (bool, error)
ExecuteWithResults(input string, callback OutputEventCallback) error
}
```
The `ExecuteWithResults` function accepts a callback, which gets provided with results during execution in the form of `*output.InternalWrappedEvent` structures.
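As a hedged usage sketch, driving a compiled executer for a single input could look like the following; the `Results` field access on `InternalWrappedEvent` is an assumption about the current output structures.
```go
package example

import (
	"fmt"

	"github.com/projectdiscovery/nuclei/v2/pkg/output"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
)

// runTarget drives an already-compiled executer against one input and prints
// the results delivered through the callback.
func runTarget(executer protocols.Executer, target string) error {
	return executer.ExecuteWithResults(target, func(event *output.InternalWrappedEvent) {
		for _, result := range event.Results {
			fmt.Printf("[%s] matched %s\n", result.TemplateID, result.Matched)
		}
	})
}
```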
The default executer is provided in `pkg/protocols/common/executer`. It takes a list of Requests and the relevant `ExecuterOptions`, and implements the Executer interface required for template execution. During template compilation, the executer is created from this package and used as-is.
A different executer is the Clustered Requests executer, which implements the Nuclei request clustering functionality in `pkg/templates`. When multiple templates can be clustered, a single HTTP request is kept along with each template's operator list; that one request is executed and every template's matchers/extractors are evaluated separately against the response.
For Workflow execution, a separate RunWorkflow function is used which executes the workflow independently from the template execution.
With this basic premise set, we can now start exploring the current runner implementation which will also walk us through the architecture of nuclei.
## internal/runner
### Template loading
The first process after all CLI specific initialisation is the loading of template/workflow paths that the user wants to run. This is done by the packages described below.
#### pkg/catalog
This package is used to resolve template paths given in mixed syntax. It takes a template directory and resolves template paths both from the provided template directory and from the current user directory.
The syntax is very versatile and can include filenames, glob patterns, directories, absolute paths, and relative paths.
The next step is the initialisation of the reporting modules, which is handled in `pkg/reporting`.
#### pkg/reporting
The reporting module contains exporters and trackers, as well as a module for deduplication and a module for result formatting.
Exporters and Trackers are interfaces defined in `pkg/reporting`.
```go
// Tracker is an interface implemented by an issue tracker
type Tracker interface {
CreateIssue(event *output.ResultEvent) error
}
// Exporter is an interface implemented by an issue exporter
type Exporter interface {
Close() error
Export(event *output.ResultEvent) error
}
```
Exporters include `Elasticsearch`, `markdown` and `sarif`. Trackers include `GitHub`, `Gitlab` and `Jira`.
Each exporter and tracker implements its own configuration in YAML format and is very modular in nature, so adding new ones is easy.
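As a sketch of how small such an integration can be, here is a hypothetical JSON-lines exporter implementing the `Exporter` interface shown above; the package name and constructor are illustrative, not part of Nuclei.
```go
package jsonexporter // hypothetical package; illustrative only

import (
	"encoding/json"
	"os"

	"github.com/projectdiscovery/nuclei/v2/pkg/output"
)

// Exporter writes each result event as one JSON line to a file, satisfying
// the Exporter interface from pkg/reporting shown above.
type Exporter struct {
	file    *os.File
	encoder *json.Encoder
}

// New creates a JSON-lines exporter writing to the given path.
func New(path string) (*Exporter, error) {
	file, err := os.Create(path)
	if err != nil {
		return nil, err
	}
	return &Exporter{file: file, encoder: json.NewEncoder(file)}, nil
}

// Export serialises a single result event as one JSON line.
func (e *Exporter) Export(event *output.ResultEvent) error {
	return e.encoder.Encode(event)
}

// Close closes the underlying file.
func (e *Exporter) Close() error {
	return e.file.Close()
}
```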
After reading all the inputs from various sources and initialising other miscellaneous options, the next step is output writing, which is done using the `pkg/output` module.
#### pkg/output
The output package implements the output-writing functionality for Nuclei.
The output writer implements the `Writer` interface, which is called each time a result is found.
```go
// Writer is an interface which writes output to somewhere for nuclei events.
type Writer interface {
Close()
Colorizer() aurora.Aurora
Write(*ResultEvent) error
Request(templateID, url, requestType string, err error)
}
```
The `ResultEvent` structure, which contains the entire detail of a found result, is passed to the Nuclei output writer. Various intermediary types like `InternalWrappedEvent` and `InternalEvent` are used throughout nuclei protocols and matchers to describe results at various stages of execution.
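For illustration, a bare-bones `Writer` implementing just the interface shown above could look like the following sketch; the `TemplateID` and `Matched` fields used here are assumptions about `ResultEvent`.
```go
package writers // hypothetical package; illustrative only

import (
	"fmt"

	"github.com/logrusorgru/aurora"
	"github.com/projectdiscovery/nuclei/v2/pkg/output"
)

// stdoutWriter prints one line per result; it implements only the Writer
// interface shown above.
type stdoutWriter struct{}

func (w *stdoutWriter) Close()                   {}
func (w *stdoutWriter) Colorizer() aurora.Aurora { return aurora.NewAurora(false) }

// Write prints the template ID and the matched input for each result.
func (w *stdoutWriter) Write(event *output.ResultEvent) error {
	_, err := fmt.Printf("[%s] %s\n", event.TemplateID, event.Matched)
	return err
}

// Request records request errors; this sketch simply ignores them.
func (w *stdoutWriter) Request(templateID, url, requestType string, err error) {}
```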
Interactsh is also initialised if it is not explicitly disabled.
#### pkg/protocols/common/interactsh
The Interactsh module is used to provide automatic Out-of-Band (OOB) vulnerability identification in Nuclei.
It uses two LRU caches: one for storing interactions for request URLs and one for storing requests for interaction URLs. Both caches are used to correlate requests received by the Interactsh OOB server with the Nuclei instance. The [Interactsh Client](https://github.com/projectdiscovery/interactsh/pkg/client) package does most of the heavy lifting of this module.
Polling for interactions and server registration only start when a template that uses the interactsh module is executed by nuclei. After that, no further registration is required for the entire run.
### RunEnumeration
Next we arrive at the `RunEnumeration` function of the runner.
The `HostErrorsCache` is initialised here; it is used throughout the enumeration run to keep track of errors per host and to skip further requests once the errors exceed the provided threshold. The functionality for the error-tracking cache is defined in [hosterrorscache.go](https://github.com/projectdiscovery/nuclei/blob/master/v2/pkg/protocols/common/hosterrorscache/hosterrorscache.go) and is pretty simplistic in nature.
Next the `WorkflowLoader` is initialised, which is used to load workflows. It lives in `v2/pkg/parsers/workflow_loader.go`.
The loader is then initialised; it is responsible for using the catalog, passed tags, filters, paths, etc. to return compiled `Templates` and `Workflows`.
#### pkg/catalog/loader
First, the paths passed by the user are normalised to absolute paths, which is done by the `pkg/catalog` module. Next, the path filter module is used to remove the excluded template/workflow paths.
The `pkg/parsers` module's `LoadTemplate` and `LoadWorkflow` functions are used to check whether templates pass validation and are not excluded via tags/severity/etc. filters. If all checks pass, the template/workflow is parsed and returned in compiled form by the `pkg/templates` `Parse` function.
The `Parse` function compiles all the requests in a template and creates Executers from them, returning a runnable Template/Workflow structure.
The clustering module comes in next; its job is to cluster identical HTTP GET requests together (many templates perform the same GET request, so this saves a lot of requests on large scans with lots of templates).
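The following self-contained sketch illustrates the clustering idea only (it is not the actual `pkg/templates` implementation): templates issuing an identical plain GET request are grouped under one key, so the request is sent once and every template's operators are evaluated against the same response.
```go
package main

import "fmt"

// clusterable is an illustrative stand-in for the part of a compiled template
// that matters for clustering: a plain GET request with no payloads.
type clusterable struct {
	TemplateID string
	Method     string
	Path       string
}

// clusterGETRequests groups templates that issue an identical request under a
// single key; this mirrors the idea, not the real implementation.
func clusterGETRequests(list []clusterable) map[string][]clusterable {
	clusters := make(map[string][]clusterable)
	for _, t := range list {
		key := t.Method + " " + t.Path
		clusters[key] = append(clusters[key], t)
	}
	return clusters
}

func main() {
	templates := []clusterable{
		{"tech-detect", "GET", "{{BaseURL}}"},
		{"favicon-hash", "GET", "{{BaseURL}}/favicon.ico"},
		{"waf-detect", "GET", "{{BaseURL}}"},
	}
	for key, cluster := range clusterGETRequests(templates) {
		fmt.Printf("%s -> %d template(s)\n", key, len(cluster))
	}
}
```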
### pkg/operators
The operators package implements all of the matching and extraction logic of Nuclei.
```go
// Operators contains the operators that can be applied on protocols
type Operators struct {
Matchers []*matchers.Matcher
Extractors []*extractors.Extractor
MatchersCondition string
}
```
A protocol only needs to embed the `operators.Operators` type shown above to utilise all the matching/extraction functionality of nuclei.
```go
// MatchFunc performs matching operation for a matcher on model and returns true or false.
type MatchFunc func(data map[string]interface{}, matcher *matchers.Matcher) (bool, []string)
// ExtractFunc performs extracting operation for an extractor on model and returns any extracted values.
type ExtractFunc func(data map[string]interface{}, matcher *extractors.Extractor) map[string]struct{}
// Execute executes the operators on data and returns a result structure
func (operators *Operators) Execute(data map[string]interface{}, match MatchFunc, extract ExtractFunc, isDebug bool) (*Result, bool)
```
The core of this process is the `Execute` function, which takes an input dictionary as well as a Match and an Extract function and returns a `Result` structure that is used later during nuclei execution to check for results.
```go
// Result is a result structure created from operators running on data.
type Result struct {
Matched bool
Extracted bool
Matches map[string][]string
Extracts map[string][]string
OutputExtracts []string
DynamicValues map[string]interface{}
PayloadValues map[string]interface{}
}
```
The internal logic for matching and extracting things like words, regexes, JQ expressions, paths, etc. is specified in `pkg/operators/matchers` and `pkg/operators/extractors`. Those packages should be consulted for a further look into the topic.
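To make the data flow concrete, here is a hedged fragment showing how a protocol request (with `CompiledOperators` and the `Match`/`Extract`/`MakeResultEvent` methods from the `Request` interface above) could run its compiled operators over response data; the map keys and the event wiring are illustrative assumptions, not the exact fields every protocol uses.
```go
package operatorsexample // hypothetical package; illustrative only

import (
	"github.com/projectdiscovery/nuclei/v2/pkg/operators"
	"github.com/projectdiscovery/nuclei/v2/pkg/output"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
)

// runOperators shows how compiled operators could be executed over protocol
// response data; the keys exposed in data are assumptions, each protocol
// defines its own set.
func runOperators(req protocols.Request, compiled *operators.Operators, input, response string, callback protocols.OutputEventCallback) {
	if compiled == nil {
		return
	}
	data := map[string]interface{}{
		"host": input,
		"body": response,
	}
	// Match and Extract from the Request interface plug directly into Execute,
	// since their signatures are the MatchFunc/ExtractFunc types shown above.
	result, matched := compiled.Execute(data, req.Match, req.Extract, false)
	if !matched || result == nil {
		return
	}
	// Wrap the operators result into the intermediate event structure and let
	// MakeResultEvent turn it into final ResultEvent items for the callback.
	event := &output.InternalWrappedEvent{InternalEvent: data, OperatorsResult: result}
	event.Results = req.MakeResultEvent(event)
	callback(event)
}
```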
### Template Execution
`pkg/core` provides the engine mechanism which runs the templates/workflows on inputs. It exposes an `Execute` function which performs the execution while also doing template clustering. Clustering can optionally be disabled by the user.
An example of using the core engine is provided below.
```go
engine := core.New(r.options)
engine.SetExecuterOptions(executerOpts)
results := engine.ExecuteWithOpts(finalTemplates, r.hmapInputProvider, true)
```
### Using Nuclei From Go Code
An example of using Nuclei From Go Code to run templates on targets is provided below.
```go
package main
import (
"fmt"
"log"
"os"
"path"
"github.com/logrusorgru/aurora"
"github.com/projectdiscovery/goflags"
"github.com/projectdiscovery/nuclei/v2/pkg/catalog"
"github.com/projectdiscovery/nuclei/v2/pkg/catalog/config"
"github.com/projectdiscovery/nuclei/v2/pkg/catalog/loader"
"github.com/projectdiscovery/nuclei/v2/pkg/core"
"github.com/projectdiscovery/nuclei/v2/pkg/core/inputs"
"github.com/projectdiscovery/nuclei/v2/pkg/output"
"github.com/projectdiscovery/nuclei/v2/pkg/parsers"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/hosterrorscache"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/interactsh"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/protocolinit"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/protocolstate"
"github.com/projectdiscovery/nuclei/v2/pkg/reporting"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
"github.com/projectdiscovery/nuclei/v2/pkg/types"
"go.uber.org/ratelimit"
)
func main() {
cache := hosterrorscache.New(30, hosterrorscache.DefaultMaxHostsCount)
defer cache.Close()
mockProgress := &testutils.MockProgressClient{}
reportingClient, _ := reporting.New(&reporting.Options{}, "")
defer reportingClient.Close()
outputWriter := testutils.NewMockOutputWriter()
outputWriter.WriteCallback = func(event *output.ResultEvent) {
fmt.Printf("Got Result: %v\n", event)
}
defaultOpts := types.DefaultOptions()
protocolstate.Init(defaultOpts)
protocolinit.Init(defaultOpts)
defaultOpts.Templates = goflags.StringSlice{"dns/cname-service-detection.yaml"}
defaultOpts.ExcludeTags = config.ReadIgnoreFile().Tags
interactOpts := interactsh.NewDefaultOptions(outputWriter, reportingClient, mockProgress)
interactClient, err := interactsh.New(interactOpts)
if err != nil {
log.Fatalf("Could not create interact client: %s\n", err)
}
defer interactClient.Close()
home, _ := os.UserHomeDir()
catalog := catalog.New(path.Join(home, "nuclei-templates"))
executerOpts := protocols.ExecuterOptions{
Output: outputWriter,
Options: defaultOpts,
Progress: mockProgress,
Catalog: catalog,
IssuesClient: reportingClient,
RateLimiter: ratelimit.New(150),
Interactsh: interactClient,
HostErrorsCache: cache,
Colorizer: aurora.NewAurora(true),
}
engine := core.New(defaultOpts)
engine.SetExecuterOptions(executerOpts)
workflowLoader, err := parsers.NewLoader(&executerOpts)
if err != nil {
log.Fatalf("Could not create workflow loader: %s\n", err)
}
executerOpts.WorkflowLoader = workflowLoader
store, err := loader.New(loader.NewConfig(defaultOpts, catalog, executerOpts))
if err != nil {
log.Fatalf("Could not create loader client: %s\n", err)
}
store.Load()
input := &inputs.SimpleInputProvider{Inputs: []string{"docs.hackerone.com"}}
_ = engine.Execute(store.Templates(), input)
}
```
### Adding a New Protocol
Protocols form the core of the Nuclei engine. All the request types like `http`, `dns`, etc. are implemented in the form of protocol requests.
A protocol must implement the `Protocol` and `Request` interfaces described above in `pkg/protocols`. We'll take an existing protocol implementation - websocket - as the example for this short reference around Nuclei internals.
The code for the websocket protocol is contained in `pkg/protocols/others/websocket`.
Below, a high-level skeleton of the websocket implementation is provided with all the important parts present.
```go
package websocket
// Request is a request for the Websocket protocol
type Request struct {
// Operators for the current request go here.
operators.Operators `yaml:",inline,omitempty"`
CompiledOperators *operators.Operators `yaml:"-"`
// description: |
// Address contains address for the request
Address string `yaml:"address,omitempty" jsonschema:"title=address for the websocket request,description=Address contains address for the request"`
// declarations here
}
// Compile compiles the request generators preparing any requests possible.
func (r *Request) Compile(options *protocols.ExecuterOptions) error {
r.options = options
// request compilation here as well as client creation
if len(r.Matchers) > 0 || len(r.Extractors) > 0 {
compiled := &r.Operators
if err := compiled.Compile(); err != nil {
return errors.Wrap(err, "could not compile operators")
}
r.CompiledOperators = compiled
}
return nil
}
// Requests returns the total number of requests the rule will perform
func (r *Request) Requests() int {
if r.generator != nil {
return r.generator.NewIterator().Total()
}
return 1
}
// GetID returns the ID for the request if any.
func (r *Request) GetID() string {
return ""
}
// ExecuteWithResults executes the protocol requests and returns results instead of writing them.
func (r *Request) ExecuteWithResults(input string, dynamicValues, previous output.InternalEvent, callback protocols.OutputEventCallback) error {
// payloads init here
if err := r.executeRequestWithPayloads(input, hostname, value, previous, callback); err != nil {
return err
}
return nil
}
// executeRequestWithPayloads executes the protocol requests with the given payloads and passes generated events to the callback.
func (r *Request) executeRequestWithPayloads(input, hostname string, dynamicValues, previous output.InternalEvent, callback protocols.OutputEventCallback) error {
header := http.Header{}
// make the actual request here after setting all options
event := eventcreator.CreateEventWithAdditionalOptions(r, data, r.options.Options.Debug || r.options.Options.DebugResponse, func(internalWrappedEvent *output.InternalWrappedEvent) {
internalWrappedEvent.OperatorsResult.PayloadValues = payloadValues
})
if r.options.Options.Debug || r.options.Options.DebugResponse {
responseOutput := responseBuilder.String()
gologger.Debug().Msgf("[%s] Dumped Websocket response for %s", r.options.TemplateID, input)
gologger.Print().Msgf("%s", responsehighlighter.Highlight(event.OperatorsResult, responseOutput, r.options.Options.NoColor))
}
callback(event)
return nil
}
func (r *Request) MakeResultEventItem(wrapped *output.InternalWrappedEvent) *output.ResultEvent {
data := &output.ResultEvent{
TemplateID: types.ToString(r.options.TemplateID),
TemplatePath: types.ToString(r.options.TemplatePath),
// ... setting more values for result event
}
return data
}
// Match performs matching operation for a matcher on model and returns:
// true and a list of matched snippets if the matcher type supports it
// otherwise false and an empty string slice
func (r *Request) Match(data map[string]interface{}, matcher *matchers.Matcher) (bool, []string) {
return protocols.MakeDefaultMatchFunc(data, matcher)
}
// Extract performs extracting operation for an extractor on model and returns true or false.
func (r *Request) Extract(data map[string]interface{}, matcher *extractors.Extractor) map[string]struct{} {
return protocols.MakeDefaultExtractFunc(data, matcher)
}
// MakeResultEvent creates a result event from internal wrapped event
func (r *Request) MakeResultEvent(wrapped *output.InternalWrappedEvent) []*output.ResultEvent {
return protocols.MakeDefaultResultEvent(r, wrapped)
}
// GetCompiledOperators returns a list of the compiled operators
func (r *Request) GetCompiledOperators() []*operators.Operators {
return []*operators.Operators{r.CompiledOperators}
}
// Type returns the type of the protocol request
func (r *Request) Type() templateTypes.ProtocolType {
return templateTypes.WebsocketProtocol
}
```
Almost all of these protocols have boilerplate functions for which default implementations are provided in the `protocols` package. Examples are the implementations of `Match`, `Extract`, `MakeResultEvent`, `GetCompiledOperators`, etc., which are almost the same throughout the Nuclei protocols code. It is enough to copy-paste them unless customization is required.
The `eventcreator` package offers the `CreateEventWithAdditionalOptions` function, which can be used to create result events after request execution.
A step-by-step description of how to add a new protocol to Nuclei -
1. Add the protocol implementation in the `pkg/protocols` directory. If it's a small protocol with fewer options, consider adding it to the `pkg/protocols/others` directory. Add the enum for the new protocol to `v2/pkg/templates/types/types.go`.
2. Add the protocol request structure to the `Template` structure fields. This is done in `pkg/templates/templates.go` with the corresponding import line.
```go
import (
...
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/others/websocket"
)
// Template is a YAML input file which defines all the requests and
// other metadata for a template.
type Template struct {
...
// description: |
// Websocket contains the Websocket request to make in the template.
RequestsWebsocket []*websocket.Request `yaml:"websocket,omitempty" json:"websocket,omitempty" jsonschema:"title=websocket requests to make,description=Websocket requests to make for the template"`
...
}
```
Also add the protocol case to the `Type` function as well as the `TemplateTypes` array in the same `templates.go` file.
```go
// TemplateTypes is a list of accepted template types
var TemplateTypes = []string{
...
"websocket",
}
// Type returns the type of the template
func (t *Template) Type() templateTypes.ProtocolType {
...
case len(t.RequestsWebsocket) > 0:
return templateTypes.WebsocketProtocol
default:
return ""
}
}
```
3. Add the protocol request to the `Requests` function and the `compileProtocolRequests` function in the `compile.go` file in the same directory.
```go
// Requests returns the total request count for the template
func (template *Template) Requests() int {
return len(template.RequestsDNS) +
...
len(template.RequestsSSL) +
len(template.RequestsWebsocket)
}
// compileProtocolRequests compiles all the protocol requests for the template
func (template *Template) compileProtocolRequests(options protocols.ExecuterOptions) error {
...
case len(template.RequestsWebsocket) > 0:
requests = template.convertRequestToProtocolsRequest(template.RequestsWebsocket)
}
template.Executer = executer.NewExecuter(requests, &options)
return nil
}
```
That's it - you've added a new protocol to Nuclei. A good next step would be to write integration tests, which are described in the `integration-tests` and `cmd/integration-tests` directories.
## Project Structure
- [v2/pkg/reporting](./v2/pkg/reporting) - Reporting modules for nuclei.
- [v2/pkg/reporting/exporters/sarif](./v2/pkg/reporting/exporters/sarif) - Sarif Result Exporter
- [v2/pkg/reporting/exporters/markdown](./v2/pkg/reporting/exporters/markdown) - Markdown Result Exporter
- [v2/pkg/reporting/exporters/es](./v2/pkg/reporting/exporters/es) - Elasticsearch Result Exporter
- [v2/pkg/reporting/dedupe](./v2/pkg/reporting/dedupe) - Dedupe module for Results
- [v2/pkg/reporting/trackers/gitlab](./v2/pkg/reporting/trackers/gitlab) - Gitlab Issue Tracker Exporter
- [v2/pkg/reporting/trackers/jira](./v2/pkg/reporting/trackers/jira) - Jira Issue Tracker Exporter
- [v2/pkg/reporting/trackers/github](./v2/pkg/reporting/trackers/github) - Github Issue Tracker Exporter
- [v2/pkg/reporting/format](./v2/pkg/reporting/format) - Result Formatting Functions
- [v2/pkg/parsers](./v2/pkg/parsers) - Implements the template and workflow loaders used for initial template discovery, validation and loading.
- [v2/pkg/types](./v2/pkg/types) - Contains CLI options as well as misc helper functions.
- [v2/pkg/progress](./v2/pkg/progress) - Progress tracking
- [v2/pkg/operators](./v2/pkg/operators) - Operators for Nuclei
- [v2/pkg/operators/common/dsl](./v2/pkg/operators/common/dsl) - DSL functions for Nuclei YAML Syntax
- [v2/pkg/operators/matchers](./v2/pkg/operators/matchers) - Matchers implementation
- [v2/pkg/operators/extractors](./v2/pkg/operators/extractors) - Extractors implementation
- [v2/pkg/catalog](./v2/pkg/catalog) - Template loading from disk helpers
- [v2/pkg/catalog/config](./v2/pkg/catalog/config) - Internal configuration management
- [v2/pkg/catalog/loader](./v2/pkg/catalog/loader) - Implements loading and validation of templates and workflows.
- [v2/pkg/catalog/loader/filter](./v2/pkg/catalog/loader/filter) - Filter filters templates based on tags and paths
- [v2/pkg/output](./v2/pkg/output) - Output module for nuclei
- [v2/pkg/workflows](./v2/pkg/workflows) - Workflow execution logic + declarations
- [v2/pkg/utils](./v2/pkg/utils) - Utility functions
- [v2/pkg/model](./v2/pkg/model) - Template Info + misc
- [v2/pkg/templates](./v2/pkg/templates) - Templates core starting point
- [v2/pkg/templates/cache](./v2/pkg/templates/cache) - Templates cache
- [v2/pkg/protocols](./v2/pkg/protocols) - Protocol Specification
- [v2/pkg/protocols/file](./v2/pkg/protocols/file) - File protocol
- [v2/pkg/protocols/network](./v2/pkg/protocols/network) - Network protocol
- [v2/pkg/protocols/common/expressions](./v2/pkg/protocols/common/expressions) - Expression evaluation + Templating Support
- [v2/pkg/protocols/common/interactsh](./v2/pkg/protocols/common/interactsh) - Interactsh integration
- [v2/pkg/protocols/common/generators](./v2/pkg/protocols/common/generators) - Payload support for Requests (Sniper, etc)
- [v2/pkg/protocols/common/executer](./v2/pkg/protocols/common/executer) - Default Template Executer
- [v2/pkg/protocols/common/replacer](./v2/pkg/protocols/common/replacer) - Template replacement helpers
- [v2/pkg/protocols/common/helpers/eventcreator](./v2/pkg/protocols/common/helpers/eventcreator) - Result event creator
- [v2/pkg/protocols/common/helpers/responsehighlighter](./v2/pkg/protocols/common/helpers/responsehighlighter) - Debug response highlighter
- [v2/pkg/protocols/common/helpers/deserialization](./v2/pkg/protocols/common/helpers/deserialization) - Deserialization helper functions
- [v2/pkg/protocols/common/hosterrorscache](./v2/pkg/protocols/common/hosterrorscache) - Host errors cache for tracking erroring hosts
- [v2/pkg/protocols/offlinehttp](./v2/pkg/protocols/offlinehttp) - Offline http protocol
- [v2/pkg/protocols/http](./v2/pkg/protocols/http) - HTTP protocol
- [v2/pkg/protocols/http/race](./v2/pkg/protocols/http/race) - HTTP Race Module
- [v2/pkg/protocols/http/raw](./v2/pkg/protocols/http/raw) - HTTP Raw Request Support
- [v2/pkg/protocols/headless](./v2/pkg/protocols/headless) - Headless Module
- [v2/pkg/protocols/headless/engine](./v2/pkg/protocols/headless/engine) - Internal Headless implementation
- [v2/pkg/protocols/dns](./v2/pkg/protocols/dns) - DNS protocol
- [v2/pkg/projectfile](./v2/pkg/projectfile) - Project File Implementation
### Notes
1. The matching as well as the interim output functionality is a bit complex; we should simplify it a bit.


@ -1,4 +1,4 @@
FROM golang:1.17.2-alpine as build-env
FROM golang:1.17.3-alpine as build-env
RUN go install -v github.com/projectdiscovery/nuclei/v2/cmd/nuclei@latest
FROM alpine:3.14


@ -126,7 +126,10 @@ CONFIGURATIONS:
-r, -resolvers string file containing resolver list for nuclei
-sr, -system-resolvers use system DNS resolving as error fallback
-passive enable passive HTTP response processing mode
-ev, -env-vars enable environment variables to be used in template
-ev, -env-vars enable environment variables support to be used in template
-cc, -client-cert client certificate file (PEM-encoded) used for authenticating against scanned hosts
-ck, -client-key client key file (PEM-encoded) used for authenticating against scanned hosts
-ca, -client-ca client certificate authority file (PEM-encoded) used for authenticating against scanned hosts
INTERACTSH:
-iserver, -interactsh-server string interactsh server url for self-hosted instance (default "https://interactsh.com")
@ -162,12 +165,12 @@ DEBUG:
-debug show all requests and responses
-debug-req show all sent requests
-debug-resp show all received responses
-proxy, -proxy-url string URL of the HTTP proxy server
-proxy-socks-url string URL of the SOCKS proxy server
-p, -proxy string[] List of HTTP(s)/SOCKS5 proxy to use (comma separated or file input)
-tlog, -trace-log string file to write sent requests trace log
-elog, -error-log string file to write sent requests error log
-version show nuclei version
-v, -verbose show verbose output
-vv display extra verbose information
-vv display templates loaded for scan
-tv, -templates-version shows the version of the installed nuclei-templates
UPDATE:
@ -277,6 +280,8 @@ We have [a discussion thread around this](https://github.com/projectdiscovery/nu
### Resources
- [Scanning Live Web Applications with Nuclei in CI/CD Pipeline](https://blog.escape.tech/devsecops-part-iii-scanning-live-web-applications/) by [@TristanKalos](https://twitter.com/TristanKalos)
- [Community Powered Scanning with Nuclei](https://blog.projectdiscovery.io/community-powered-scanning-with-nuclei/)
- [Nuclei Unleashed - Quickly write complex exploits](https://blog.projectdiscovery.io/nuclei-unleashed-quickly-write-complex-exploits/)
- [Nuclei - Fuzz all the things](https://blog.projectdiscovery.io/nuclei-fuzz-all-the-things/)


@ -121,8 +121,7 @@ nuclei -h
|templates-version|Show the version of the installed nuclei-templates|nuclei -templates-version|
|v|Show verbose information about sent requests|nuclei -v|
|version|Show the nuclei version|nuclei -version|
|proxy-url|Specify a proxy address|nuclei -proxy-url hxxp://127.0.0.1:8080|
|proxy-socks-url|Specify a SOCKS proxy address|nuclei -proxy-socks-url socks5://127.0.0.1:8080|
|proxy|Specify a proxy address|nuclei -proxy ./proxy.txt|
|random-agent|Use a random User-Agent|nuclei -random-agent|
|H|Custom request headers|nuclei -H "x-bug-bounty:hacker"|


@ -219,6 +219,32 @@ Headless contains the headless request to make in the template.
<div class="dd">
<code>ssl</code> <i>[]<a href="#sslrequest">ssl.Request</a></i>
</div>
<div class="dt">
SSL contains the SSL request to make in the template.
</div>
<hr />
<div class="dd">
<code>websocket</code> <i>[]<a href="#websocketrequest">websocket.Request</a></i>
</div>
<div class="dt">
Websocket contains the Websocket request to make in the template.
</div>
<hr />
<div class="dd">
<code>workflows</code> <i>[]<a href="#workflowsworkflowtemplate">workflows.WorkflowTemplate</a></i>
</div>
@ -829,7 +855,7 @@ in a combined manner allowing multirequest based matchers.
<div class="dd">
<code>attack</code> <i>string</i>
<code>attack</code> <i><a href="#generatorsattacktypeholder">generators.AttackTypeHolder</a></i>
</div>
<div class="dt">
@ -854,7 +880,7 @@ Valid values:
<div class="dd">
<code>method</code> <i>string</i>
<code>method</code> <i>HTTPMethodTypeHolder</i>
</div>
<div class="dt">
@ -1238,13 +1264,17 @@ Appears in:
- <code><a href="#headlessrequest">headless.Request</a>.matchers</code>
- <code><a href="#sslrequest">ssl.Request</a>.matchers</code>
- <code><a href="#websocketrequest">websocket.Request</a>.matchers</code>
<hr />
<div class="dd">
<code>type</code> <i>string</i>
<code>type</code> <i>MatcherTypeHolder</i>
</div>
<div class="dt">
@ -1554,6 +1584,26 @@ Valid values:
<hr />
<div class="dd">
<code>case-insensitive</code> <i>bool</i>
</div>
<div class="dt">
CaseInsensitive enables case-insensitive matches. Default is false.
Valid values:
- <code>false</code>
- <code>true</code>
</div>
<hr />
@ -1574,6 +1624,10 @@ Appears in:
- <code><a href="#headlessrequest">headless.Request</a>.extractors</code>
- <code><a href="#sslrequest">ssl.Request</a>.extractors</code>
- <code><a href="#websocketrequest">websocket.Request</a>.extractors</code>
<hr />
@ -1604,7 +1658,7 @@ name: cookie-extractor
<div class="dd">
<code>type</code> <i>string</i>
<code>type</code> <i>TypeHolder</i>
</div>
<div class="dt">
@ -1833,6 +1887,42 @@ in the next request for some protocols (like HTTP).
<hr />
<div class="dd">
<code>case-insensitive</code> <i>bool</i>
</div>
<div class="dt">
CaseInsensitive enables case-insensitive extractions. Default is false.
Valid values:
- <code>false</code>
- <code>true</code>
</div>
<hr />
## generators.AttackTypeHolder
AttackTypeHolder is used to hold internal type of the protocol
Appears in:
- <code><a href="#httprequest">http.Request</a>.attack</code>
- <code><a href="#networkrequest">network.Request</a>.attack</code>
- <code><a href="#websocketrequest">websocket.Request</a>.attack</code>
@ -1953,12 +2043,12 @@ name: '{{FQDN}}'
<div class="dd">
<code>type</code> <i>string</i>
<code>type</code> <i>DNSRequestTypeHolder</i>
</div>
<div class="dt">
Type is the type of DNS request to make.
RequestType is the type of DNS request to make.
Valid values:
@ -2035,6 +2125,43 @@ retries: 5
```
</div>
<hr />
<div class="dd">
<code>trace</code> <i>bool</i>
</div>
<div class="dt">
Trace performs a trace operation for the target.
</div>
<hr />
<div class="dd">
<code>trace-max-recursion</code> <i>int</i>
</div>
<div class="dt">
TraceMaxRecursion is the number of max recursion allowed for trace operations
Examples:
```yaml
# Use a retry of 100 to 150 generally
trace-max-recursion: 100
```
</div>
<hr />
@ -2318,7 +2445,7 @@ host:
<div class="dd">
<code>attack</code> <i>string</i>
<code>attack</code> <i><a href="#generatorsattacktypeholder">generators.AttackTypeHolder</a></i>
</div>
<div class="dt">
@ -2392,6 +2519,31 @@ read-size: 2048
```
</div>
<hr />
<div class="dd">
<code>read-all</code> <i>bool</i>
</div>
<div class="dt">
ReadAll determines if the data stream should be read till the end regardless of the size
Default value for read-all is false.
Examples:
```yaml
read-all: false
```
</div>
<hr />
@ -2494,7 +2646,7 @@ data: hex_decode('50494e47')
<div class="dd">
<code>type</code> <i>string</i>
<code>type</code> <i>NetworkInputTypeHolder</i>
</div>
<div class="dt">
@ -2727,7 +2879,7 @@ Description is the optional description of the headless action
<div class="dd">
<code>action</code> <i>string</i>
<code>action</code> <i>ActionTypeHolder</i>
</div>
<div class="dt">
@ -2787,6 +2939,303 @@ Valid values:
## ssl.Request
Request is a request for the SSL protocol
Appears in:
- <code><a href="#template">Template</a>.ssl</code>
<hr />
<div class="dd">
<code>matchers</code> <i>[]<a href="#matchersmatcher">matchers.Matcher</a></i>
</div>
<div class="dt">
Matchers contains the detection mechanism for the request to identify
whether the request was successful by doing pattern matching
on request/responses.
Multiple matchers can be combined with `matcher-condition` flag
which accepts either `and` or `or` as argument.
</div>
<hr />
<div class="dd">
<code>extractors</code> <i>[]<a href="#extractorsextractor">extractors.Extractor</a></i>
</div>
<div class="dt">
Extractors contains the extraction mechanism for the request to identify
and extract parts of the response.
</div>
<hr />
<div class="dd">
<code>matchers-condition</code> <i>string</i>
</div>
<div class="dt">
MatchersCondition is the condition between the matchers. Default is OR.
Valid values:
- <code>and</code>
- <code>or</code>
</div>
<hr />
<div class="dd">
<code>address</code> <i>string</i>
</div>
<div class="dt">
Address contains address for the request
</div>
<hr />
## websocket.Request
Request is a request for the Websocket protocol
Appears in:
- <code><a href="#template">Template</a>.websocket</code>
<hr />
<div class="dd">
<code>matchers</code> <i>[]<a href="#matchersmatcher">matchers.Matcher</a></i>
</div>
<div class="dt">
Matchers contains the detection mechanism for the request to identify
whether the request was successful by doing pattern matching
on request/responses.
Multiple matchers can be combined with `matcher-condition` flag
which accepts either `and` or `or` as argument.
</div>
<hr />
<div class="dd">
<code>extractors</code> <i>[]<a href="#extractorsextractor">extractors.Extractor</a></i>
</div>
<div class="dt">
Extractors contains the extraction mechanism for the request to identify
and extract parts of the response.
</div>
<hr />
<div class="dd">
<code>matchers-condition</code> <i>string</i>
</div>
<div class="dt">
MatchersCondition is the condition between the matchers. Default is OR.
Valid values:
- <code>and</code>
- <code>or</code>
</div>
<hr />
<div class="dd">
<code>address</code> <i>string</i>
</div>
<div class="dt">
Address contains address for the request
</div>
<hr />
<div class="dd">
<code>inputs</code> <i>[]<a href="#websocketinput">websocket.Input</a></i>
</div>
<div class="dt">
Inputs contains inputs for the websocket protocol
</div>
<hr />
<div class="dd">
<code>headers</code> <i>map[string]string</i>
</div>
<div class="dt">
Headers contains headers for the request.
</div>
<hr />
<div class="dd">
<code>attack</code> <i><a href="#generatorsattacktypeholder">generators.AttackTypeHolder</a></i>
</div>
<div class="dt">
Attack is the type of payload combinations to perform.
Sniper is each payload once, pitchfork combines multiple payload sets and clusterbomb generates
permutations and combinations for all payloads.
Valid values:
- <code>sniper</code>
- <code>pitchfork</code>
- <code>clusterbomb</code>
</div>
<hr />
<div class="dd">
<code>payloads</code> <i>map[string]interface{}</i>
</div>
<div class="dt">
Payloads contains any payloads for the current request.
Payloads support both key-values combinations where a list
of payloads is provided, or optionally a single file can also
be provided as payload which will be read on run-time.
</div>
<hr />
## websocket.Input
Appears in:
- <code><a href="#websocketrequest">websocket.Request</a>.inputs</code>
<hr />
<div class="dd">
<code>data</code> <i>string</i>
</div>
<div class="dt">
Data is the data to send as the input.
It supports DSL Helper Functions as well as normal expressions.
Examples:
```yaml
data: TEST
```
```yaml
data: hex_decode('50494e47')
```
</div>
<hr />
<div class="dd">
<code>name</code> <i>string</i>
</div>
<div class="dt">
Name is the optional name of the data read to provide matching on.
Examples:
```yaml
name: prefix
```
</div>
<hr />
## workflows.WorkflowTemplate
Appears in:


@ -0,0 +1,16 @@
id: basic-get-case-insensitive
info:
name: Basic GET Request
author: pdteam
severity: info
requests:
- method: GET
path:
- "{{BaseURL}}"
matchers:
- type: word
case-insensitive: true
words:
- "ThIS is TEsT MAtcHEr TExT"


@ -0,0 +1,23 @@
id: basic-get-redirects-chain-headers
info:
name: Basic GET Redirects Request With Chain header
author: pdteam
severity: info
requests:
- method: GET
path:
- "{{BaseURL}}"
redirects: true
max-redirects: 3
matchers-condition: and
matchers:
- type: word
part: header
words:
- "TestRedirectHeaderMatch"
- type: status
status:
- 302


@ -0,0 +1,19 @@
id: interactsh-integration-test
info:
name: Interactsh Integration Test
author: pdteam
severity: info
requests:
- method: GET
path:
- "{{BaseURL}}"
headers:
url: 'http://{{interactsh-url}}'
matchers:
- type: word
part: interactsh_protocol # Confirms the HTTP Interaction
words:
- "http"


@ -7,7 +7,7 @@ info:
requests:
- raw:
- |
- |+
GET / HTTP/1.1
Host:
Content-Length: 4


@ -0,0 +1,10 @@
id: workflow-example
info:
name: Test Workflow Template
author: pdteam
severity: info
workflows:
- template: workflow/match-1.yaml
- template: workflow/match-2.yaml


@ -0,0 +1,11 @@
id: condition-matched-workflow
info:
name: Condition Matched Workflow
author: pdteam
severity: info
workflows:
- template: workflow/match-1.yaml
subtemplates:
- template: workflow/match-2.yaml


@ -0,0 +1,17 @@
id: basic-get-headers
info:
name: Basic GET Headers Request
author: pdteam
severity: info
requests:
- method: GET
path:
- "{{BaseURL}}"
headers:
test: nuclei
matchers:
- type: word
words:
- "This is test headers matcher text"


@ -0,0 +1,15 @@
id: basic-get
info:
name: Basic GET Request
author: pdteam
severity: info
requests:
- method: GET
path:
- "{{BaseURL}}"
matchers:
- type: word
words:
- "This is test matcher text"


@ -0,0 +1,2 @@
loader/get.yaml
loader/get-headers.yaml


@ -0,0 +1,2 @@
loader/basic.yaml
loader/condition-matched.yaml


@ -26,8 +26,8 @@ gitlab:
username: test-username
# token is the token for gitlab account.
token: test-token
# project-id is the ID of the repository.
project-id: 1234
# project-name is the name/id of the project(repository).
project-name: "1234"
# issue-label is the label of the created issue type
issue-label: bug


@ -28,8 +28,8 @@ gitlab:
username: test-username
# token is the token for gitlab account.
token: test-token
# project-id is the ID of the repository.
project-id: 1234
# project-name is the name/id of the project(repository).
project-name: "1234"
# issue-label is the label of the created issue type
issue-label: bug


@ -0,0 +1,16 @@
id: basic-request
info:
name: Basic Request
author: pdteam
severity: info
websocket:
- address: '{{Scheme}}://{{Hostname}}'
inputs:
- data: hello
matchers:
- type: word
words:
- world
part: response


@ -0,0 +1,16 @@
id: basic-cswsh-request
info:
name: Basic cswsh Request
author: pdteam
severity: info
websocket:
- address: '{{Scheme}}://{{Hostname}}'
headers:
Origin: 'http://evil.com'
matchers:
- type: word
words:
- true
part: success


@ -0,0 +1,16 @@
id: basic-nocswsh-request
info:
name: Basic Non-Vulnerable cswsh Request
author: pdteam
severity: info
websocket:
- address: '{{Scheme}}://{{Hostname}}'
headers:
Origin: 'http://evil.com'
matchers:
- type: word
words:
- true
part: success


@ -0,0 +1,16 @@
id: basic-request-path
info:
name: Basic Request Path
author: pdteam
severity: info
websocket:
- address: '{{Scheme}}://{{Hostname}}'
inputs:
- data: hello
matchers:
- type: word
words:
- world
part: response


@ -130,15 +130,8 @@
"description": "Name of the extractor"
},
"type": {
"enum": [
"regex",
"kval",
"json",
"xpath"
],
"type": "string",
"title": "type of the extractor",
"description": "Type of the extractor"
"$schema": "http://json-schema.org/draft-04/schema#",
"$ref": "#/definitions/extractors.TypeHolder"
},
"regex": {
"items": {
@ -191,26 +184,35 @@
"type": "boolean",
"title": "mark extracted value for internal variable use",
"description": "Internal when set to true will allow using the value extracted in the next request for some protocols"
},
"case-insensitive": {
"type": "boolean",
"title": "use case insensitive extract",
"description": "use case insensitive extract"
}
},
"additionalProperties": false,
"type": "object"
},
"extractors.TypeHolder": {
"enum": [
"regex",
"kval",
"xpath",
"json"
],
"type": "string",
"title": "type of the extractor",
"description": "Type of the extractor"
},
"matchers.Matcher": {
"required": [
"type"
],
"properties": {
"type": {
"enum": [
"status",
"size",
"word",
"regex",
"binary",
"dsl"
],
"type": "string",
"$schema": "http://json-schema.org/draft-04/schema#",
"$ref": "#/definitions/matchers.MatcherTypeHolder",
"title": "type of matcher",
"description": "Type of the matcher"
},
@ -293,11 +295,55 @@
"type": "string",
"title": "encoding for word field",
"description": "Optional encoding for the word fields"
},
"case-insensitive": {
"type": "boolean",
"title": "use case insensitive match",
"description": "use case insensitive match"
}
},
"additionalProperties": false,
"type": "object"
},
"matchers.MatcherTypeHolder": {
"enum": [
"word",
"regex",
"binary",
"status",
"size",
"dsl"
],
"type": "string",
"title": "type of the matcher",
"description": "Type of the matcher,enum=status,enum=size,enum=word,enum=regex,enum=binary,enum=dsl"
},
"generators.AttackTypeHolder": {
"enum": [
"batteringram",
"pitchfork",
"clusterbomb"
],
"type": "string",
"title": "type of the attack",
"description": "Type of the attack"
},
"dns.DNSRequestTypeHolder": {
"enum": [
"A",
"NS",
"DS",
"CNAME",
"SOA",
"PTR",
"MX",
"TXT",
"AAAA"
],
"type": "string",
"title": "type of DNS request to make",
"description": "Type is the type of DNS request to make,enum=A,enum=NS,enum=DS,enum=CNAME,enum=SOA,enum=PTR,enum=MX,enum=TXT,enum=AAAA"
},
"dns.Request": {
"properties": {
"matchers": {
@ -336,18 +382,8 @@
"description": "Name is the Hostname to make DNS request for"
},
"type": {
"enum": [
"A",
"NS",
"DS",
"CNAME",
"SOA",
"PTR",
"MX",
"TXT",
"AAAA"
],
"type": "string",
"$schema": "http://json-schema.org/draft-04/schema#",
"$ref": "#/definitions/dns.DNSRequestTypeHolder",
"title": "type of dns request to make",
"description": "Type is the type of DNS request to make"
},
@ -369,6 +405,16 @@
"title": "retries for dns request",
"description": "Retries is the number of retries for the DNS request"
},
"trace": {
"type": "boolean",
"title": "trace operation",
"description": "Trace performs a trace operation for the target."
},
"trace-max-recursion": {
"type": "integer",
"title": "trace-max-recursion level for dns request",
"description": "TraceMaxRecursion is the number of max recursion allowed for trace operations"
},
"recursion": {
"type": "boolean",
"title": "recurse all servers",
@ -519,6 +565,16 @@
"description": "Description of the headless action"
},
"action": {
"$schema": "http://json-schema.org/draft-04/schema#",
"$ref": "#/definitions/engine.ActionTypeHolder",
"title": "action to perform",
"description": "Type of actions to perform"
}
},
"additionalProperties": false,
"type": "object"
},
"engine.ActionTypeHolder": {
"enum": [
"navigate",
"script",
@ -532,7 +588,7 @@
"waitload",
"getresource",
"extract",
"setmethod",
"set-method",
"addheader",
"setheader",
"deleteheader",
@ -540,15 +596,29 @@
"waitevent",
"keyboard",
"debug",
"sleep"
"sleep",
"waitvisible"
],
"type": "string",
"title": "action to perform",
"description": "Type of actions to perform"
}
"description": "Type of actions to perform,enum=navigate,enum=script,enum=click,enum=rightclick,enum=text,enum=screenshot,enum=time,enum=select,enum=files,enum=waitload,enum=getresource,enum=extract,enum=setmethod,enum=addheader,enum=setheader,enum=deleteheader,enum=setbody,enum=waitevent,enum=keyboard,enum=debug,enum=sleep"
},
"additionalProperties": false,
"type": "object"
"http.HTTPMethodTypeHolder": {
"enum": [
"GET",
"HEAD",
"POST",
"PUT",
"DELETE",
"CONNECT",
"OPTIONS",
"TRACE",
"PATCH",
"PURGE"
],
"type": "string",
"title": "method is the HTTP request method",
"description": "Method is the HTTP Request Method,enum=GET,enum=HEAD,enum=POST,enum=PUT,enum=DELETE,enum=CONNECT,enum=OPTIONS,enum=TRACE,enum=PATCH,enum=PURGE"
},
"http.Request": {
"properties": {
@ -605,29 +675,14 @@
"description": "Optional name for the HTTP Request"
},
"attack": {
"enum": [
"batteringram",
"pitchfork",
"clusterbomb"
],
"type": "string",
"$schema": "http://json-schema.org/draft-04/schema#",
"$ref": "#/definitions/generators.AttackTypeHolder",
"title": "attack is the payload combination",
"description": "Attack is the type of payload combinations to perform"
},
"method": {
"enum": [
"GET",
"HEAD",
"POST",
"PUT",
"DELETE",
"CONNECT",
"OPTIONS",
"TRACE",
"PATCH",
"PURGE"
],
"type": "string",
"$schema": "http://json-schema.org/draft-04/schema#",
"$ref": "#/definitions/http.HTTPMethodTypeHolder",
"title": "method is the http request method",
"description": "Method is the HTTP Request Method"
},
@ -738,11 +793,8 @@
"description": "Data is the data to send as the input"
},
"type": {
"enum": [
"hex",
"text"
],
"type": "string",
"$schema": "http://json-schema.org/draft-04/schema#",
"$ref": "#/definitions/network.NetworkInputTypeHolder",
"title": "type is the type of input data",
"description": "Type of input specified in data field"
},
@ -760,6 +812,15 @@
"additionalProperties": false,
"type": "object"
},
"network.NetworkInputTypeHolder": {
"enum": [
"hex",
"text"
],
"type": "string",
"title": "type is the type of input data",
"description": "description=Type of input specified in data field,enum=hex,enum=text"
},
"network.Request": {
"properties": {
"id": {
@ -776,12 +837,7 @@
"description": "Host to send network requests to"
},
"attack": {
"enum": [
"batteringram",
"pitchfork",
"clusterbomb"
],
"type": "string",
"$ref": "#/definitions/generators.AttackTypeHolder",
"title": "attack is the payload combination",
"description": "Attack is the type of payload combinations to perform"
},
@ -809,6 +865,11 @@
"title": "size of network response to read",
"description": "Size of response to read at the end. Default is 1024 bytes"
},
"read-all": {
"type": "boolean",
"title": "read all response stream",
"description": "Read all response stream till the server stops sending"
},
"matchers": {
"items": {
"$ref": "#/definitions/matchers.Matcher"
@ -838,6 +899,128 @@
"additionalProperties": false,
"type": "object"
},
"ssl.Request": {
"properties": {
"matchers": {
"items": {
"$ref": "#/definitions/matchers.Matcher"
},
"type": "array",
"title": "matchers to run on response",
"description": "Detection mechanism to identify whether the request was successful by doing pattern matching"
},
"extractors": {
"items": {
"$ref": "#/definitions/extractors.Extractor"
},
"type": "array",
"title": "extractors to run on response",
"description": "Extractors contains the extraction mechanism for the request to identify and extract parts of the response"
},
"matchers-condition": {
"enum": [
"and",
"or"
],
"type": "string",
"title": "condition between the matchers",
"description": "Conditions between the matchers"
},
"address": {
"type": "string",
"title": "address for the ssl request",
"description": "Address contains address for the request"
}
},
"additionalProperties": false,
"type": "object"
},
"websocket.Input": {
"properties": {
"data": {
"type": "string",
"title": "data to send as input",
"description": "Data is the data to send as the input"
},
"name": {
"type": "string",
"title": "optional name for data read",
"description": "Optional name of the data read to provide matching on"
}
},
"additionalProperties": false,
"type": "object"
},
"websocket.Request": {
"properties": {
"matchers": {
"items": {
"$ref": "#/definitions/matchers.Matcher"
},
"type": "array",
"title": "matchers to run on response",
"description": "Detection mechanism to identify whether the request was successful by doing pattern matching"
},
"extractors": {
"items": {
"$ref": "#/definitions/extractors.Extractor"
},
"type": "array",
"title": "extractors to run on response",
"description": "Extractors contains the extraction mechanism for the request to identify and extract parts of the response"
},
"matchers-condition": {
"enum": [
"and",
"or"
],
"type": "string",
"title": "condition between the matchers",
"description": "Conditions between the matchers"
},
"address": {
"type": "string",
"title": "address for the websocket request",
"description": "Address contains address for the request"
},
"inputs": {
"items": {
"$schema": "http://json-schema.org/draft-04/schema#",
"$ref": "#/definitions/websocket.Input"
},
"type": "array",
"title": "inputs for the websocket request",
"description": "Inputs contains any input/output for the current request"
},
"headers": {
"patternProperties": {
".*": {
"type": "string"
}
},
"type": "object",
"title": "headers contains the request headers",
"description": "Headers contains headers for the request"
},
"attack": {
"$ref": "#/definitions/generators.AttackTypeHolder",
"title": "attack is the payload combination",
"description": "Attack is the type of payload combinations to perform"
},
"payloads": {
"patternProperties": {
".*": {
"additionalProperties": true
}
},
"type": "object",
"title": "payloads for the webosocket request",
"description": "Payloads contains any payloads for the current request"
}
},
"additionalProperties": false,
"type": "object"
},
"templates.Template": {
"required": [
"id",
@ -845,6 +1028,7 @@
],
"properties": {
"id": {
"pattern": "^([a-zA-Z0-9]+[-_])*[a-zA-Z0-9]+$",
"type": "string",
"title": "id of the template",
"description": "The Unique ID for the template",
@ -903,6 +1087,24 @@
"title": "headless requests to make",
"description": "Headless requests to make for the template"
},
"ssl": {
"items": {
"$schema": "http://json-schema.org/draft-04/schema#",
"$ref": "#/definitions/ssl.Request"
},
"type": "array",
"title": "ssl requests to make",
"description": "SSL requests to make for the template"
},
"websocket": {
"items": {
"$schema": "http://json-schema.org/draft-04/schema#",
"$ref": "#/definitions/websocket.Request"
},
"type": "array",
"title": "websocket requests to make",
"description": "Websocket requests to make for the template"
},
"workflows": {
"items": {
"$schema": "http://json-schema.org/draft-04/schema#",


@ -11,7 +11,7 @@ import (
"github.com/logrusorgru/aurora"
"github.com/pkg/errors"
"github.com/projectdiscovery/nuclei/v2/internal/testutils"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)
var (


@ -1,7 +1,7 @@
package main
import (
"github.com/projectdiscovery/nuclei/v2/internal/testutils"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)
var dnsTestCases = map[string]testutils.TestCase{


@ -11,7 +11,7 @@ import (
"github.com/julienschmidt/httprouter"
"github.com/projectdiscovery/nuclei/v2/internal/testutils"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)
var httpTestcases = map[string]testutils.TestCase{
@ -31,7 +31,37 @@ var httpTestcases = map[string]testutils.TestCase{
"http/raw-unsafe-request.yaml": &httpRawUnsafeRequest{},
"http/request-condition.yaml": &httpRequestCondition{},
"http/request-condition-new.yaml": &httpRequestCondition{},
"http/interactsh.yaml": &httpInteractshRequest{},
"http/self-contained.yaml": &httpRequestSelContained{},
"http/get-case-insensitive.yaml": &httpGetCaseInsensitive{},
"http/get.yaml,http/get-case-insensitive.yaml": &httpGetCaseInsensitiveCluster{},
"http/get-redirects-chain-headers.yaml": &httpGetRedirectsChainHeaders{},
}
type httpInteractshRequest struct{}
// Executes executes a test case and returns an error if occurred
func (h *httpInteractshRequest) Execute(filePath string) error {
router := httprouter.New()
router.GET("/", httprouter.Handle(func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
value := r.Header.Get("url")
if value != "" {
if resp, _ := http.DefaultClient.Get(value); resp != nil {
resp.Body.Close()
}
}
}))
ts := httptest.NewServer(router)
defer ts.Close()
results, err := testutils.RunNucleiTemplateAndGetResults(filePath, ts.URL, debug)
if err != nil {
return err
}
if len(results) != 1 {
return errIncorrectResultsCount(results)
}
return nil
}
type httpGetHeaders struct{}
@ -526,3 +556,75 @@ func (h *httpRequestSelContained) Execute(filePath string) error {
}
return nil
}
type httpGetCaseInsensitive struct{}
// Execute executes a test case and returns an error if occurred
func (h *httpGetCaseInsensitive) Execute(filePath string) error {
router := httprouter.New()
router.GET("/", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
fmt.Fprintf(w, "THIS IS TEST MATCHER TEXT")
})
ts := httptest.NewServer(router)
defer ts.Close()
results, err := testutils.RunNucleiTemplateAndGetResults(filePath, ts.URL, debug)
if err != nil {
return err
}
if len(results) != 1 {
return errIncorrectResultsCount(results)
}
return nil
}
type httpGetCaseInsensitiveCluster struct{}
// Execute executes a test case and returns an error if occurred
func (h *httpGetCaseInsensitiveCluster) Execute(filesPath string) error {
router := httprouter.New()
router.GET("/", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
fmt.Fprintf(w, "This is test matcher text")
})
ts := httptest.NewServer(router)
defer ts.Close()
files := strings.Split(filesPath, ",")
results, err := testutils.RunNucleiTemplateAndGetResults(files[0], ts.URL, debug, "-t", files[1])
if err != nil {
return err
}
if len(results) != 2 {
return errIncorrectResultsCount(results)
}
return nil
}
type httpGetRedirectsChainHeaders struct{}
// Execute executes a test case and returns an error if occurred
func (h *httpGetRedirectsChainHeaders) Execute(filePath string) error {
router := httprouter.New()
router.GET("/", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
http.Redirect(w, r, "/redirected", http.StatusFound)
})
router.GET("/redirected", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
w.Header().Set("Secret", "TestRedirectHeaderMatch")
http.Redirect(w, r, "/final", http.StatusFound)
})
router.GET("/final", func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
_, _ = w.Write([]byte("ok"))
})
ts := httptest.NewServer(router)
defer ts.Close()
results, err := testutils.RunNucleiTemplateAndGetResults(filePath, ts.URL, debug)
if err != nil {
return err
}
if len(results) != 1 {
return errIncorrectResultsCount(results)
}
return nil
}
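For reference, every value registered in these maps satisfies the integration-test `TestCase` contract from the testutils package (moved from `internal/testutils` to `pkg/testutils` in this change). A minimal sketch of that contract, assuming the interface is as small as the call sites above suggest:

```go
// TestCase is implemented by each integration test case registered in the protocol maps.
// Execute receives the template path (or a comma-joined list of template paths) and
// returns an error when the observed results do not match the expectation.
type TestCase interface {
	Execute(filePath string) error
}
```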

View File

@ -6,7 +6,7 @@ import (
"strings"
"github.com/logrusorgru/aurora"
"github.com/projectdiscovery/nuclei/v2/internal/testutils"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)
var (
@ -26,6 +26,8 @@ func main() {
"network": networkTestcases,
"dns": dnsTestCases,
"workflow": workflowTestcases,
"loader": loaderTestcases,
"websocket": websocketTestCases,
}
for proto, tests := range protocolTests {
if protocol == "" || protocol == proto {
@ -50,5 +52,5 @@ func main() {
}
func errIncorrectResultsCount(results []string) error {
return fmt.Errorf("incorrect number of results %s", strings.Join(results, "\n\t"))
return fmt.Errorf("incorrect number of results \n\t%s", strings.Join(results, "\n\t"))
}

View File

@ -0,0 +1,123 @@
package main
import (
"fmt"
"net/http"
"net/http/httptest"
"os"
"strings"
"github.com/julienschmidt/httprouter"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)
var loaderTestcases = map[string]testutils.TestCase{
"loader/template-list.yaml": &remoteTemplateList{},
"loader/workflow-list.yaml": &remoteWorkflowList{},
"loader/nonexistent-template-list.yaml": &nonExistentTemplateList{},
"loader/nonexistent-workflow-list.yaml": &nonExistentWorkflowList{},
}
type remoteTemplateList struct{}
// Execute executes a test case and returns an error if occurred
func (h *remoteTemplateList) Execute(templateList string) error {
router := httprouter.New()
router.GET("/", httprouter.Handle(func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
fmt.Fprintf(w, "This is test matcher text")
if strings.EqualFold(r.Header.Get("test"), "nuclei") {
fmt.Fprintf(w, "This is test headers matcher text")
}
}))
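// The /template_list endpoint stands in for a remote template index: it simply serves the local fixture
// file back to nuclei, which is pointed at it via the -tu flag further down.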
router.GET("/template_list", httprouter.Handle(func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
file, err := os.ReadFile(templateList)
if err != nil {
w.WriteHeader(500)
}
_, err = w.Write(file)
if err != nil {
w.WriteHeader(500)
}
}))
ts := httptest.NewServer(router)
defer ts.Close()
results, err := testutils.RunNucleiBareArgsAndGetResults(debug, "-target", ts.URL, "-tu", ts.URL+"/template_list")
if err != nil {
return err
}
if len(results) != 2 {
return errIncorrectResultsCount(results)
}
return nil
}
type remoteWorkflowList struct{}
// Execute executes a test case and returns an error if occurred
func (h *remoteWorkflowList) Execute(workflowList string) error {
router := httprouter.New()
router.GET("/", httprouter.Handle(func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
fmt.Fprintf(w, "This is test matcher text")
if strings.EqualFold(r.Header.Get("test"), "nuclei") {
fmt.Fprintf(w, "This is test headers matcher text")
}
}))
router.GET("/workflow_list", httprouter.Handle(func(w http.ResponseWriter, r *http.Request, _ httprouter.Params) {
file, err := os.ReadFile(workflowList)
if err != nil {
w.WriteHeader(500)
}
_, err = w.Write(file)
if err != nil {
w.WriteHeader(500)
}
}))
ts := httptest.NewServer(router)
defer ts.Close()
results, err := testutils.RunNucleiBareArgsAndGetResults(debug, "-target", ts.URL, "-wu", ts.URL+"/workflow_list")
if err != nil {
return err
}
if len(results) != 3 {
return errIncorrectResultsCount(results)
}
return nil
}
type nonExistentTemplateList struct{}
// Execute executes a test case and returns an error if occurred
func (h *nonExistentTemplateList) Execute(nonExistingTemplateList string) error {
router := httprouter.New()
ts := httptest.NewServer(router)
defer ts.Close()
_, err := testutils.RunNucleiBareArgsAndGetResults(debug, "-target", ts.URL, "-tu", ts.URL+"/404")
if err == nil {
return fmt.Errorf("expected error for nonexisting workflow url")
}
return nil
}
type nonExistentWorkflowList struct{}
// Execute executes a test case and returns an error if occurred
func (h *nonExistentWorkflowList) Execute(nonExistingWorkflowList string) error {
router := httprouter.New()
ts := httptest.NewServer(router)
defer ts.Close()
_, err := testutils.RunNucleiBareArgsAndGetResults(debug, "-target", ts.URL, "-wu", ts.URL+"/404")
if err == nil {
return fmt.Errorf("expected error for nonexisting workflow url")
}
return nil
}

View File

@ -3,7 +3,7 @@ package main
import (
"net"
"github.com/projectdiscovery/nuclei/v2/internal/testutils"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)
var networkTestcases = map[string]testutils.TestCase{

View File

@ -0,0 +1,114 @@
package main
import (
"net"
"strings"
"github.com/gobwas/ws/wsutil"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)
var websocketTestCases = map[string]testutils.TestCase{
"websocket/basic.yaml": &websocketBasic{},
"websocket/cswsh.yaml": &websocketCswsh{},
"websocket/no-cswsh.yaml": &websocketNoCswsh{},
"websocket/path.yaml": &websocketWithPath{},
}
type websocketBasic struct{}
// Execute executes a test case and returns an error if occurred
func (h *websocketBasic) Execute(filePath string) error {
connHandler := func(conn net.Conn) {
for {
msg, op, _ := wsutil.ReadClientData(conn)
if string(msg) != "hello" {
return
}
_ = wsutil.WriteServerMessage(conn, op, []byte("world"))
}
}
originValidate := func(origin string) bool {
return true
}
ts := testutils.NewWebsocketServer("", connHandler, originValidate)
defer ts.Close()
results, err := testutils.RunNucleiTemplateAndGetResults(filePath, strings.ReplaceAll(ts.URL, "http", "ws"), debug)
if err != nil {
return err
}
if len(results) != 1 {
return errIncorrectResultsCount(results)
}
return nil
}
type websocketCswsh struct{}
// Execute executes a test case and returns an error if occurred
func (h *websocketCswsh) Execute(filePath string) error {
connHandler := func(conn net.Conn) {
}
originValidate := func(origin string) bool {
return true
}
ts := testutils.NewWebsocketServer("", connHandler, originValidate)
defer ts.Close()
results, err := testutils.RunNucleiTemplateAndGetResults(filePath, strings.ReplaceAll(ts.URL, "http", "ws"), debug)
if err != nil {
return err
}
if len(results) != 1 {
return errIncorrectResultsCount(results)
}
return nil
}
type websocketNoCswsh struct{}
// Execute executes a test case and returns an error if occurred
func (h *websocketNoCswsh) Execute(filePath string) error {
connHandler := func(conn net.Conn) {
}
originValidate := func(origin string) bool {
return origin == "https://google.com"
}
ts := testutils.NewWebsocketServer("", connHandler, originValidate)
defer ts.Close()
results, err := testutils.RunNucleiTemplateAndGetResults(filePath, strings.ReplaceAll(ts.URL, "http", "ws"), debug)
if err != nil {
return err
}
if len(results) != 0 {
return errIncorrectResultsCount(results)
}
return nil
}
type websocketWithPath struct{}
// Execute executes a test case and returns an error if occurred
func (h *websocketWithPath) Execute(filePath string) error {
connHandler := func(conn net.Conn) {
}
originValidate := func(origin string) bool {
return origin == "https://google.com"
}
ts := testutils.NewWebsocketServer("/test", connHandler, originValidate)
defer ts.Close()
results, err := testutils.RunNucleiTemplateAndGetResults(filePath, strings.ReplaceAll(ts.URL, "http", "ws"), debug)
if err != nil {
return err
}
if len(results) != 0 {
return errIncorrectResultsCount(results)
}
return nil
}
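The cswsh cases above rely on `testutils.NewWebsocketServer` consulting the `originValidate` callback before completing the handshake. A rough sketch of how such a helper could be built with gobwas/ws is shown below; it is an illustration under that assumption, not the actual `pkg/testutils` implementation.

```go
package wstest

import (
	"net"
	"net/http"
	"net/http/httptest"

	"github.com/gobwas/ws"
)

// NewWebsocketServerSketch upgrades requests on the given path to websocket connections,
// rejecting any handshake whose Origin header fails the originValidate check.
func NewWebsocketServerSketch(path string, handler func(net.Conn), originValidate func(origin string) bool) *httptest.Server {
	if path == "" {
		path = "/"
	}
	mux := http.NewServeMux()
	mux.HandleFunc(path, func(w http.ResponseWriter, r *http.Request) {
		if !originValidate(r.Header.Get("Origin")) {
			http.Error(w, "origin not allowed", http.StatusForbidden)
			return
		}
		conn, _, _, err := ws.UpgradeHTTP(r, w) // gobwas/ws handshake over the hijacked connection
		if err != nil {
			return
		}
		go func() {
			defer conn.Close()
			handler(conn)
		}()
	})
	return httptest.NewServer(mux)
}
```

With a helper of this shape, the no-cswsh and path test cases above are simply exercising the origin check and the mount path respectively.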

View File

@ -7,7 +7,7 @@ import (
"github.com/julienschmidt/httprouter"
"github.com/projectdiscovery/nuclei/v2/internal/testutils"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)
var workflowTestcases = map[string]testutils.TestCase{

View File

@ -9,6 +9,7 @@ import (
"github.com/projectdiscovery/gologger"
"github.com/projectdiscovery/nuclei/v2/internal/runner"
"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
templateTypes "github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
"github.com/projectdiscovery/nuclei/v2/pkg/types"
)
@ -54,8 +55,10 @@ on extensive configurability, massive extensibility and ease of use.`)
createGroup(flagSet, "templates", "Templates",
flagSet.StringSliceVarP(&options.Templates, "templates", "t", []string{}, "template or template directory paths to include in the scan"),
flagSet.StringSliceVarP(&options.TemplateURLs, "template-url", "tu", []string{}, "URL containing list of templates to run"),
flagSet.BoolVarP(&options.NewTemplates, "new-templates", "nt", false, "run only new templates added in latest nuclei-templates release"),
flagSet.StringSliceVarP(&options.Workflows, "workflows", "w", []string{}, "workflow or workflow directory paths to include in the scan"),
flagSet.StringSliceVarP(&options.WorkflowURLs, "workflow-url", "wu", []string{}, "URL containing list of workflows to run"),
flagSet.BoolVar(&options.Validate, "validate", false, "validate the passed templates to nuclei"),
flagSet.BoolVar(&options.TemplateList, "tl", false, "list all available templates"),
)
@ -68,7 +71,9 @@ on extensive configurability, massive extensibility and ease of use.`)
flagSet.StringSliceVarP(&options.ExcludedTemplates, "exclude-templates", "et", []string{}, "template or template directory paths to exclude"),
flagSet.VarP(&options.Severities, "severity", "s", fmt.Sprintf("Templates to run based on severity. Possible values: %s", severity.GetSupportedSeverities().String())),
flagSet.VarP(&options.ExcludeSeverities, "exclude-severity", "es", fmt.Sprintf("Templates to exclude based on severity. Possible values: %s", severity.GetSupportedSeverities().String())),
flagSet.NormalizedStringSliceVarP(&options.Author, "author", "a", []string{}, "execute templates that are (co-)created by the specified authors"),
flagSet.VarP(&options.Protocols, "type", "pt", fmt.Sprintf("protocol types to be executed. Possible values: %s", templateTypes.GetSupportedProtocolTypes())),
flagSet.VarP(&options.ExcludeProtocols, "exclude-type", "ept", fmt.Sprintf("protocol types to not be executed. Possible values: %s", templateTypes.GetSupportedProtocolTypes())),
flagSet.NormalizedStringSliceVarP(&options.Authors, "author", "a", []string{}, "execute templates that are (co-)created by the specified authors"),
)
createGroup(flagSet, "output", "Output",
@ -80,6 +85,7 @@ on extensive configurability, massive extensibility and ease of use.`)
flagSet.BoolVarP(&options.NoMeta, "no-meta", "nm", false, "don't display match metadata"),
flagSet.BoolVarP(&options.NoTimestamp, "no-timestamp", "nts", false, "don't display timestamp metadata in CLI output"),
flagSet.StringVarP(&options.ReportingDB, "report-db", "rdb", "", "local nuclei reporting database (always use this to persist report data)"),
flagSet.BoolVarP(&options.MatcherStatus, "matcher-status", "ms", false, "show optional match failure status"),
flagSet.StringVarP(&options.MarkdownExportDirectory, "markdown-export", "me", "", "directory to export results in markdown format"),
flagSet.StringVarP(&options.SarifExport, "sarif-export", "se", "", "file to export results in SARIF format"),
)
@ -93,6 +99,9 @@ on extensive configurability, massive extensibility and ease of use.`)
flagSet.BoolVarP(&options.SystemResolvers, "system-resolvers", "sr", false, "use system DNS resolving as error fallback"),
flagSet.BoolVar(&options.OfflineHTTP, "passive", false, "enable passive HTTP response processing mode"),
flagSet.BoolVarP(&options.EnvironmentVariables, "env-vars", "ev", false, "enable environment variables to be used in template"),
flagSet.StringVarP(&options.ClientCertFile, "client-cert", "cc", "", "client certificate file (PEM-encoded) used for authenticating against scanned hosts"),
flagSet.StringVarP(&options.ClientKeyFile, "client-key", "ck", "", "client key file (PEM-encoded) used for authenticating against scanned hosts"),
flagSet.StringVarP(&options.ClientCAFile, "client-ca", "ca", "", "client certificate authority file (PEM-encoded) used for authenticating against scanned hosts"),
)
createGroup(flagSet, "interactsh", "interactsh",
@ -101,7 +110,7 @@ on extensive configurability, massive extensibility and ease of use.`)
flagSet.IntVar(&options.InteractionsCacheSize, "interactions-cache-size", 5000, "number of requests to keep in the interactions cache"),
flagSet.IntVar(&options.InteractionsEviction, "interactions-eviction", 60, "number of seconds to wait before evicting requests from cache"),
flagSet.IntVar(&options.InteractionsPollDuration, "interactions-poll-duration", 5, "number of seconds to wait before each interaction poll request"),
flagSet.IntVar(&options.InteractionsColldownPeriod, "interactions-cooldown-period", 5, "extra time for interaction polling before exiting"),
flagSet.IntVar(&options.InteractionsCooldownPeriod, "interactions-cooldown-period", 5, "extra time for interaction polling before exiting"),
flagSet.BoolVarP(&options.NoInteractsh, "no-interactsh", "ni", false, "disable interactsh server for OAST testing, exclude OAST based templates"),
)
@ -110,6 +119,8 @@ on extensive configurability, massive extensibility and ease of use.`)
flagSet.IntVarP(&options.RateLimitMinute, "rate-limit-minute", "rlm", 0, "maximum number of requests to send per minute"),
flagSet.IntVarP(&options.BulkSize, "bulk-size", "bs", 25, "maximum number of hosts to be analyzed in parallel per template"),
flagSet.IntVarP(&options.TemplateThreads, "concurrency", "c", 25, "maximum number of templates to be executed in parallel"),
flagSet.IntVarP(&options.HeadlessBulkSize, "headless-bulk-size", "hbs", 10, "maximum number of headless hosts to be analyzed in parallel per template"),
flagSet.IntVarP(&options.HeadlessTemplateThreads, "headless-concurrency", "hc", 10, "maximum number of headless templates to be executed in parallel"),
)
createGroup(flagSet, "optimization", "Optimizations",
@ -133,12 +144,9 @@ on extensive configurability, massive extensibility and ease of use.`)
flagSet.BoolVar(&options.Debug, "debug", false, "show all requests and responses"),
flagSet.BoolVar(&options.DebugRequests, "debug-req", false, "show all sent requests"),
flagSet.BoolVar(&options.DebugResponse, "debug-resp", false, "show all received responses"),
/* TODO why the separation? http://proxy:port vs socks5://proxy:port etc
TODO should auto-set the HTTP_PROXY variable for the process? */
flagSet.StringVarP(&options.ProxyURL, "proxy-url", "proxy", "", "URL of the HTTP proxy server"),
flagSet.StringVar(&options.ProxySocksURL, "proxy-socks-url", "", "URL of the SOCKS proxy server"),
flagSet.NormalizedStringSliceVarP(&options.Proxy, "proxy", "p", []string{}, "List of HTTP(s)/SOCKS5 proxy to use (comma separated or file input)"),
flagSet.StringVarP(&options.TraceLogFile, "trace-log", "tlog", "", "file to write sent requests trace log"),
flagSet.StringVarP(&options.ErrorLogFile, "error-log", "elog", "", "file to write sent requests error log"),
flagSet.BoolVar(&options.Version, "version", false, "show nuclei version"),
flagSet.BoolVarP(&options.Verbose, "verbose", "v", false, "show verbose output"),
flagSet.BoolVar(&options.VerboseVerbose, "vv", false, "display templates loaded for scan"),
@ -175,10 +183,3 @@ func createGroup(flagSet *goflags.FlagSet, groupName, description string, flags
currentFlag.Group(groupName)
}
}
/*
HacktoberFest update: Below, you can find our ticket recommendations. Tasks with the "good first issue" label are suitable for first time contributors. If you have other ideas, or need help with getting started, join our Discord channel or reach out to @forgedhallpass.
https://github.com/issues?q=is%3Aopen+is%3Aissue+user%3Aprojectdiscovery+label%3AHacktoberfest
*/

View File

@ -5,65 +5,62 @@ go 1.17
require (
github.com/Ice3man543/nvd v1.0.8
github.com/Knetic/govaluate v3.0.1-0.20171022003610-9aa49832a739+incompatible
github.com/akrylysov/pogreb v0.10.1 // indirect
github.com/alecthomas/jsonschema v0.0.0-20210818095345-1014919a589c
github.com/alecthomas/jsonschema v0.0.0-20211022214203-8b29eab41725
github.com/andygrunwald/go-jira v1.14.0
github.com/antchfx/htmlquery v1.2.3
github.com/antchfx/htmlquery v1.2.4
github.com/apex/log v1.9.0
github.com/blang/semver v3.5.1+incompatible
github.com/bluele/gcache v0.0.2
github.com/c4milo/unpackit v0.1.0 // indirect
github.com/corpix/uarand v0.1.1
github.com/go-rod/rod v0.101.7
github.com/go-playground/validator/v10 v10.9.0
github.com/go-rod/rod v0.101.8
github.com/gobwas/ws v1.1.0
github.com/google/go-github v17.0.0+incompatible
github.com/gosuri/uilive v0.0.4 // indirect
github.com/gosuri/uiprogress v0.0.1 // indirect
github.com/itchyny/gojq v0.12.4
github.com/itchyny/gojq v0.12.5
github.com/json-iterator/go v1.1.12
github.com/julienschmidt/httprouter v1.3.0
github.com/karlseguin/ccache v2.0.3+incompatible
github.com/karrick/godirwalk v1.16.1
github.com/logrusorgru/aurora v2.0.3+incompatible
github.com/mattn/go-runewidth v0.0.13 // indirect
github.com/miekg/dns v1.1.43
github.com/olekukonko/tablewriter v0.0.5
github.com/owenrumney/go-sarif v1.0.11
github.com/pkg/errors v0.9.1
github.com/projectdiscovery/clistats v0.0.8
github.com/projectdiscovery/fastdialer v0.0.13-0.20210917073912-cad93d88e69e
github.com/projectdiscovery/cryptoutil v0.0.0-20210805184155-b5d2512f9345
github.com/projectdiscovery/fastdialer v0.0.13
github.com/projectdiscovery/filekv v0.0.0-20210915124239-3467ef45dd08
github.com/projectdiscovery/fileutil v0.0.0-20210928100737-cab279c5d4b5
github.com/projectdiscovery/goflags v0.0.8-0.20211007103353-9b9229e8a240
github.com/projectdiscovery/goflags v0.0.8-0.20211028121123-edf02bc05b1a
github.com/projectdiscovery/gologger v1.1.4
github.com/projectdiscovery/hmap v0.0.2-0.20210917080408-0fd7bd286bfa
github.com/projectdiscovery/interactsh v0.0.6
github.com/projectdiscovery/nuclei-updatecheck-api v0.0.0-20210914222811-0a072d262f77
github.com/projectdiscovery/nuclei-updatecheck-api v0.0.0-20211006155443-c0a8d610a4df
github.com/projectdiscovery/rawhttp v0.0.7
github.com/projectdiscovery/retryabledns v1.0.13-0.20210916165024-76c5b76fd59a
github.com/projectdiscovery/retryabledns v1.0.13-0.20211109182249-43d38df59660
github.com/projectdiscovery/retryablehttp-go v1.0.2
github.com/projectdiscovery/stringsutil v0.0.0-20211013053023-e7b2e104d80d
github.com/projectdiscovery/stringsutil v0.0.0-20210830151154-f567170afdd9
github.com/projectdiscovery/yamldoc-go v1.0.2
github.com/remeh/sizedwaitgroup v1.0.0
github.com/rs/xid v1.3.0
github.com/segmentio/ksuid v1.0.4
github.com/shirou/gopsutil/v3 v3.21.7
github.com/shirou/gopsutil/v3 v3.21.9
github.com/spaolacci/murmur3 v1.1.0
github.com/spf13/cast v1.4.1
github.com/stretchr/testify v1.7.0
github.com/syndtr/goleveldb v1.0.0
github.com/tj/go-update v2.2.5-0.20200519121640-62b4b798fd68+incompatible
github.com/valyala/fasttemplate v1.2.1
github.com/xanzy/go-gitlab v0.50.3
github.com/weppos/publicsuffix-go v0.15.1-0.20210928183822-5ee35905bd95
github.com/xanzy/go-gitlab v0.51.1
github.com/ysmood/gson v0.6.4 // indirect
github.com/ysmood/leakless v0.7.0 // indirect
go.uber.org/atomic v1.9.0
go.uber.org/multierr v1.7.0
go.uber.org/ratelimit v0.2.0
golang.org/x/net v0.0.0-20210916014120-12bc252f5db8
golang.org/x/oauth2 v0.0.0-20210817223510-7df4dd6e12ab
golang.org/x/sys v0.0.0-20210915083310-ed5796bab164 // indirect
golang.org/x/net v0.0.0-20211020060615-d418f374d309
golang.org/x/oauth2 v0.0.0-20211005180243-6b3c2da341f1
golang.org/x/text v0.3.7
google.golang.org/appengine v1.6.7 // indirect
gopkg.in/yaml.v2 v2.4.0
moul.io/http2curl v1.0.0
)
@ -72,12 +69,13 @@ require (
git.mills.io/prologic/smtpd v0.0.0-20210710122116-a525b76c287a // indirect
github.com/PuerkitoBio/goquery v1.6.0 // indirect
github.com/StackExchange/wmi v1.2.1 // indirect
github.com/akrylysov/pogreb v0.10.1 // indirect
github.com/andres-erbsen/clock v0.0.0-20160526145045-9e14626cd129 // indirect
github.com/andybalholm/cascadia v1.1.0 // indirect
github.com/antchfx/xpath v1.1.6 // indirect
github.com/aymerick/douceur v0.2.0 // indirect
github.com/antchfx/xpath v1.2.0 // indirect
github.com/bits-and-blooms/bitset v1.2.0 // indirect
github.com/bits-and-blooms/bloom/v3 v3.0.1 // indirect
github.com/c4milo/unpackit v0.1.0 // indirect
github.com/cnf/structhash v0.0.0-20201127153200-e1b16c1ebc08 // indirect
github.com/davecgh/go-spew v1.1.1 // indirect
github.com/dimchansky/utfbom v1.1.1 // indirect
@ -85,13 +83,18 @@ require (
github.com/eggsampler/acme/v3 v3.2.1 // indirect
github.com/fatih/structs v1.1.0 // indirect
github.com/go-ole/go-ole v1.2.5 // indirect
github.com/go-playground/locales v0.14.0 // indirect
github.com/go-playground/universal-translator v0.18.0 // indirect
github.com/gobwas/httphead v0.1.0 // indirect
github.com/gobwas/pool v0.2.1 // indirect
github.com/golang-jwt/jwt v3.2.1+incompatible // indirect
github.com/golang/groupcache v0.0.0-20200121045136-8c9f03a8e57e // indirect
github.com/golang/protobuf v1.5.2 // indirect
github.com/golang/snappy v0.0.4 // indirect
github.com/google/go-querystring v1.0.0 // indirect
github.com/google/uuid v1.3.0 // indirect
github.com/gorilla/css v1.0.0 // indirect
github.com/gosuri/uilive v0.0.4 // indirect
github.com/gosuri/uiprogress v0.0.1 // indirect
github.com/hashicorp/go-cleanhttp v0.5.1 // indirect
github.com/hashicorp/go-retryablehttp v0.6.8 // indirect
github.com/iancoleman/orderedmap v0.0.0-20190318233801-ac98e3ecb4b0 // indirect
@ -100,20 +103,19 @@ require (
github.com/karlseguin/ccache/v2 v2.0.8 // indirect
github.com/klauspost/compress v1.13.6 // indirect
github.com/klauspost/pgzip v1.2.5 // indirect
github.com/leodido/go-urn v1.2.1 // indirect
github.com/mattn/go-isatty v0.0.13 // indirect
github.com/microcosm-cc/bluemonday v1.0.15 // indirect
github.com/mattn/go-runewidth v0.0.13 // indirect
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd // indirect
github.com/modern-go/reflect2 v1.0.2 // indirect
github.com/pmezard/go-difflib v1.0.0 // indirect
github.com/projectdiscovery/blackrock v0.0.0-20210415162320-b38689ae3a2e // indirect
github.com/projectdiscovery/cryptoutil v0.0.0-20210805184155-b5d2512f9345 // indirect
github.com/projectdiscovery/iputil v0.0.0-20210804143329-3a30fcde43f3 // indirect
github.com/projectdiscovery/mapcidr v0.0.8 // indirect
github.com/projectdiscovery/networkpolicy v0.0.1 // indirect
github.com/rivo/uniseg v0.2.0 // indirect
github.com/saintfish/chardet v0.0.0-20120816061221-3af4cd4741ca // indirect
github.com/tklauser/go-sysconf v0.3.7 // indirect
github.com/tklauser/numcpus v0.2.3 // indirect
github.com/tklauser/go-sysconf v0.3.9 // indirect
github.com/tklauser/numcpus v0.3.0 // indirect
github.com/trivago/tgo v1.0.7 // indirect
github.com/ulikunitz/xz v0.5.10 // indirect
github.com/valyala/bytebufferpool v1.0.0 // indirect
@ -121,7 +123,10 @@ require (
github.com/ysmood/goob v0.3.0 // indirect
github.com/zclconf/go-cty v1.8.4 // indirect
go.etcd.io/bbolt v1.3.6 // indirect
golang.org/x/crypto v0.0.0-20210711020723-a769d52b0f97 // indirect
golang.org/x/sys v0.0.0-20210915083310-ed5796bab164 // indirect
golang.org/x/time v0.0.0-20191024005414-555d28b269f0 // indirect
google.golang.org/appengine v1.6.7 // indirect
google.golang.org/protobuf v1.27.1 // indirect
gopkg.in/corvus-ch/zbase32.v1 v1.0.0 // indirect
gopkg.in/yaml.v3 v3.0.0-20210107192922-496545a6307b // indirect

View File

@ -67,8 +67,9 @@ github.com/ajg/form v1.5.1/go.mod h1:uL1WgH+h2mgNtvBq0339dVnzXdBETtL2LeUXaIv25UY
github.com/akrylysov/pogreb v0.10.0/go.mod h1:pNs6QmpQ1UlTJKDezuRWmaqkgUE2TuU0YTWyqJZ7+lI=
github.com/akrylysov/pogreb v0.10.1 h1:FqlR8VR7uCbJdfUob916tPM+idpKgeESDXOA1K0DK4w=
github.com/akrylysov/pogreb v0.10.1/go.mod h1:pNs6QmpQ1UlTJKDezuRWmaqkgUE2TuU0YTWyqJZ7+lI=
github.com/alecthomas/jsonschema v0.0.0-20210818095345-1014919a589c h1:oJsq4z4xKgZWWOhrSZuLZ5KyYfRFytddLL1E5+psfIY=
github.com/alecthomas/jsonschema v0.0.0-20210818095345-1014919a589c/go.mod h1:/n6+1/DWPltRLWL/VKyUxg6tzsl5kHUCcraimt4vr60=
github.com/alecthomas/jsonschema v0.0.0-20211022214203-8b29eab41725 h1:NjwIgLQlD46o79bheVG4SCdRnnOz4XtgUN1WABX5DLA=
github.com/alecthomas/jsonschema v0.0.0-20211022214203-8b29eab41725/go.mod h1:/n6+1/DWPltRLWL/VKyUxg6tzsl5kHUCcraimt4vr60=
github.com/alecthomas/template v0.0.0-20160405071501-a0175ee3bccc/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc=
github.com/alecthomas/template v0.0.0-20190718012654-fb15b899a751/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc=
github.com/alecthomas/units v0.0.0-20151022065526-2efee857e7cf/go.mod h1:ybxpYRFXyAe+OPACYpWeL0wqObRcbAqCMya13uyzqw0=
@ -79,10 +80,12 @@ github.com/andybalholm/cascadia v1.1.0 h1:BuuO6sSfQNFRu1LppgbD25Hr2vLYW25JvxHs5z
github.com/andybalholm/cascadia v1.1.0/go.mod h1:GsXiBklL0woXo1j/WYWtSYYC4ouU9PqHO0sqidkEA4Y=
github.com/andygrunwald/go-jira v1.14.0 h1:7GT/3qhar2dGJ0kq8w0d63liNyHOnxZsUZ9Pe4+AKBI=
github.com/andygrunwald/go-jira v1.14.0/go.mod h1:KMo2f4DgMZA1C9FdImuLc04x4WQhn5derQpnsuBFgqE=
github.com/antchfx/htmlquery v1.2.3 h1:sP3NFDneHx2stfNXCKbhHFo8XgNjCACnU/4AO5gWz6M=
github.com/antchfx/htmlquery v1.2.3/go.mod h1:B0ABL+F5irhhMWg54ymEZinzMSi0Kt3I2if0BLYa3V0=
github.com/antchfx/xpath v1.1.6 h1:6sVh6hB5T6phw1pFpHRQ+C4bd8sNI+O58flqtg7h0R0=
github.com/antchfx/htmlquery v1.2.4 h1:qLteofCMe/KGovBI6SQgmou2QNyedFUW+pE+BpeZ494=
github.com/antchfx/htmlquery v1.2.4/go.mod h1:2xO6iu3EVWs7R2JYqBbp8YzG50gj/ofqs5/0VZoDZLc=
github.com/antchfx/xpath v1.1.6/go.mod h1:Yee4kTMuNiPYJ7nSNorELQMr1J33uOpXDMByNYhvtNk=
github.com/antchfx/xpath v1.2.0 h1:mbwv7co+x0RwgeGAOHdrKy89GvHaGvxxBtPK0uF9Zr8=
github.com/antchfx/xpath v1.2.0/go.mod h1:i54GszH55fYfBmoZXapTHN8T8tkcHfRgLyVwwqzXNcs=
github.com/apache/thrift v0.12.0/go.mod h1:cp2SuWMxlEZw2r+iP2GNCdIi4C1qmUzdZFSVb+bacwQ=
github.com/apache/thrift v0.13.0/go.mod h1:cp2SuWMxlEZw2r+iP2GNCdIi4C1qmUzdZFSVb+bacwQ=
github.com/apex/log v1.9.0 h1:FHtw/xuaM8AgmvDDTI9fiwoAL25Sq2cxojnZICUU8l0=
@ -96,14 +99,11 @@ github.com/armon/consul-api v0.0.0-20180202201655-eb2c6b5be1b6/go.mod h1:grANhF5
github.com/armon/go-metrics v0.0.0-20180917152333-f0300d1749da/go.mod h1:Q73ZrmVTwzkszR9V5SSuryQ31EELlFMUz1kKyl939pY=
github.com/armon/go-radix v0.0.0-20180808171621-7fddfc383310/go.mod h1:ufUuZ+zHj4x4TnLV4JWEpy2hxWSpsRywHrMgIH9cCH8=
github.com/aryann/difflib v0.0.0-20170710044230-e206f873d14a/go.mod h1:DAHtR1m6lCRdSC2Tm3DSWRPvIPr6xNKyeHdqDQSQT+A=
github.com/asaskevich/govalidator v0.0.0-20210307081110-f21760c49a8d/go.mod h1:WaHUgvxTVq04UNunO+XhnAqY/wQc+bxr74GqbsZ/Jqw=
github.com/aws/aws-lambda-go v1.13.3/go.mod h1:4UKl9IzQMoD+QF79YdCuzCwp8VbmG4VAQwij/eHl5CU=
github.com/aws/aws-sdk-go v1.20.6/go.mod h1:KmX6BPdI08NWTb3/sm4ZGu5ShLoqVDhKgpiN924inxo=
github.com/aws/aws-sdk-go v1.27.0/go.mod h1:KmX6BPdI08NWTb3/sm4ZGu5ShLoqVDhKgpiN924inxo=
github.com/aws/aws-sdk-go-v2 v0.18.0/go.mod h1:JWVYvqSMppoMJC0x5wdwiImzgXTI9FuZwxzkQq9wy+g=
github.com/aybabtme/rgbterm v0.0.0-20170906152045-cc83f3b3ce59/go.mod h1:q/89r3U2H7sSsE2t6Kca0lfwTK8JdoNGS/yzM/4iH5I=
github.com/aymerick/douceur v0.2.0 h1:Mv+mAeH1Q+n9Fr+oyamOlAkUNPWPlA8PPGR0QAaYuPk=
github.com/aymerick/douceur v0.2.0/go.mod h1:wlT5vV2O3h55X9m7iVYN0TBM0NH/MmbLnd30/FjWUq4=
github.com/aymerick/raymond v2.0.3-0.20180322193309-b565731e1464+incompatible/go.mod h1:osfaiScAUVup+UC9Nfq76eWqDhXlp+4UYaA8uhTBO6g=
github.com/beorn7/perks v0.0.0-20180321164747-3a771d992973/go.mod h1:Dwedo/Wpr24TaqPxmxbtue+5NUziq4I4S80YR8gNf3Q=
github.com/beorn7/perks v1.0.0/go.mod h1:KWe93zE9D1o94FZ5RNwFwVgaQK1VOXiVxmqh+CedLV8=
@ -230,16 +230,30 @@ github.com/go-logr/logr v0.4.0/go.mod h1:z6/tIYblkpsD+a4lm/fGIIU9mZ+XfAiaFtq7xTg
github.com/go-martini/martini v0.0.0-20170121215854-22fa46961aab/go.mod h1:/P9AEU963A2AYjv4d1V5eVL1CQbEJq6aCNHDDjibzu8=
github.com/go-ole/go-ole v1.2.5 h1:t4MGB5xEDZvXI+0rMjjsfBsD7yAgp/s9ZDkL1JndXwY=
github.com/go-ole/go-ole v1.2.5/go.mod h1:pprOEPIfldk/42T2oK7lQ4v4JSDwmV0As9GaiUsvbm0=
github.com/go-playground/assert/v2 v2.0.1 h1:MsBgLAaY856+nPRTKrp3/OZK38U/wa0CcBYNjji3q3A=
github.com/go-playground/assert/v2 v2.0.1/go.mod h1:VDjEfimB/XKnb+ZQfWdccd7VUvScMdVu0Titje2rxJ4=
github.com/go-playground/locales v0.14.0 h1:u50s323jtVGugKlcYeyzC0etD1HifMjqmJqb8WugfUU=
github.com/go-playground/locales v0.14.0/go.mod h1:sawfccIbzZTqEDETgFXqTho0QybSa7l++s0DH+LDiLs=
github.com/go-playground/universal-translator v0.18.0 h1:82dyy6p4OuJq4/CByFNOn/jYrnRPArHwAcmLoJZxyho=
github.com/go-playground/universal-translator v0.18.0/go.mod h1:UvRDBj+xPUEGrFYl+lu/H90nyDXpg0fqeB/AQUGNTVA=
github.com/go-playground/validator/v10 v10.9.0 h1:NgTtmN58D0m8+UuxtYmGztBJB7VnPgjj221I1QHci2A=
github.com/go-playground/validator/v10 v10.9.0/go.mod h1:74x4gJWsvQexRdW8Pn3dXSGrTK4nAUsbPlLADvpJkos=
github.com/go-redis/redis v6.15.5+incompatible/go.mod h1:NAIEuMOZ/fxfXJIrKDQDz8wamY7mA7PouImQ2Jvg6kA=
github.com/go-rod/rod v0.91.1/go.mod h1:/W4lcZiCALPD603MnJGIvhtywP3R6yRB9EDfFfsHiiI=
github.com/go-rod/rod v0.101.7 h1:kbI5CNvcRhf7feybBln4xDutsM0mbsF0ENNZfKcF6WA=
github.com/go-rod/rod v0.101.7/go.mod h1:N/zlT53CfSpq74nb6rOR0K8UF0SPUPBmzBnArrms+mY=
github.com/go-rod/rod v0.101.8 h1:oV0O97uwjkCVyAP0hD6K6bBE8FUMIjs0dtF7l6kEBsU=
github.com/go-rod/rod v0.101.8/go.mod h1:N/zlT53CfSpq74nb6rOR0K8UF0SPUPBmzBnArrms+mY=
github.com/go-sql-driver/mysql v1.4.0/go.mod h1:zAC/RDZ24gD3HViQzih4MyKcchzm+sOG5ZlKdlhCg5w=
github.com/go-stack/stack v1.8.0/go.mod h1:v0f6uXyyMGvRgIKkXu+yp6POWl0qKG85gN/melR3HDY=
github.com/go-task/slim-sprig v0.0.0-20210107165309-348f09dbbbc0/go.mod h1:fyg7847qk6SyHyPtNmDHnmrv/HOrqktSC+C9fM+CJOE=
github.com/gobwas/httphead v0.0.0-20180130184737-2c6c146eadee/go.mod h1:L0fX3K22YWvt/FAX9NnzrNzcI4wNYi9Yku4O0LKYflo=
github.com/gobwas/httphead v0.1.0 h1:exrUm0f4YX0L7EBwZHuCF4GDp8aJfVeBrlLQrs6NqWU=
github.com/gobwas/httphead v0.1.0/go.mod h1:O/RXo79gxV8G+RqlR/otEwx4Q36zl9rqC5u12GKvMCM=
github.com/gobwas/pool v0.2.0/go.mod h1:q8bcK0KcYlCgd9e7WYLm9LpyS+YeLd8JVDW6WezmKEw=
github.com/gobwas/pool v0.2.1 h1:xfeeEhW7pwmX8nuLVlqbzVc7udMDrwetjEv+TZIz1og=
github.com/gobwas/pool v0.2.1/go.mod h1:q8bcK0KcYlCgd9e7WYLm9LpyS+YeLd8JVDW6WezmKEw=
github.com/gobwas/ws v1.0.2/go.mod h1:szmBTxLgaFppYjEmNtny/v3w89xOydFnnZMcgRRu/EM=
github.com/gobwas/ws v1.1.0 h1:7RFti/xnNkMJnrK7D1yQ/iCIB5OrrY/54/H930kIbHA=
github.com/gobwas/ws v1.1.0/go.mod h1:nzvNcVha5eUziGrbxFCo6qFIojQHjJV5cLYIbezhfL0=
github.com/gogo/googleapis v0.0.0-20180223154316-0cd9801be74a/go.mod h1:gf4bu3Q80BeJ6H1S1vYPm8/ELATdvryBaNFGgqEef3s=
github.com/gogo/googleapis v1.1.0/go.mod h1:gf4bu3Q80BeJ6H1S1vYPm8/ELATdvryBaNFGgqEef3s=
github.com/gogo/googleapis v1.4.1/go.mod h1:2lpHqI5OcWCtVElxXnPt+s8oJvMpySlOyM6xDCrzib4=
@ -328,8 +342,6 @@ github.com/googleapis/gax-go/v2 v2.0.5/go.mod h1:DWXyrwAJ9X0FpwwEdw+IPEYBICEFu5m
github.com/gopherjs/gopherjs v0.0.0-20181017120253-0766667cb4d1 h1:EGx4pi6eqNxGaHF6qqu48+N2wcFQ5qg5FXgOdqsJ5d8=
github.com/gopherjs/gopherjs v0.0.0-20181017120253-0766667cb4d1/go.mod h1:wJfORRmW1u3UXTncJ5qlYoELFm8eSnnEO6hX4iZ3EWY=
github.com/gorilla/context v1.1.1/go.mod h1:kBGZzfjB9CEq2AlWe17Uuf7NDRt0dE0s8S51q0aT7Yg=
github.com/gorilla/css v1.0.0 h1:BQqNyPTi50JCFMTw/b67hByjMVXZRwGha6wxVGkeihY=
github.com/gorilla/css v1.0.0/go.mod h1:Dn721qIggHpt4+EFCcTLTU/vk5ySda2ReITrtgBl60c=
github.com/gorilla/mux v1.6.2/go.mod h1:1lud6UwP+6orDFRuTfBEV8e9/aOM/c4fVVCaMa2zaAs=
github.com/gorilla/mux v1.7.3/go.mod h1:1lud6UwP+6orDFRuTfBEV8e9/aOM/c4fVVCaMa2zaAs=
github.com/gorilla/websocket v0.0.0-20170926233335-4201258b820c/go.mod h1:E7qHFY5m1UJ88s3WnNqhKjPHQ0heANvMoAMk2YaljkQ=
@ -385,8 +397,9 @@ github.com/iris-contrib/go.uuid v2.0.0+incompatible/go.mod h1:iz2lgM/1UnEf1kP0L/
github.com/iris-contrib/i18n v0.0.0-20171121225848-987a633949d0/go.mod h1:pMCz62A0xJL6I+umB2YTlFRwWXaDFA0jy+5HzGiJjqI=
github.com/iris-contrib/schema v0.0.1/go.mod h1:urYA3uvUNG1TIIjOSCzHr9/LmbQo8LrOcOqfqxa4hXw=
github.com/itchyny/go-flags v1.5.0/go.mod h1:lenkYuCobuxLBAd/HGFE4LRoW8D3B6iXRQfWYJ+MNbA=
github.com/itchyny/gojq v0.12.4 h1:8zgOZWMejEWCLjbF/1mWY7hY7QEARm7dtuhC6Bp4R8o=
github.com/itchyny/gojq v0.12.4/go.mod h1:EQUSKgW/YaOxmXpAwGiowFDO4i2Rmtk5+9dFyeiymAg=
github.com/itchyny/gojq v0.12.5 h1:6SJ1BQ1VAwJAlIvLSIZmqHP/RUEq3qfVWvsRxrqhsD0=
github.com/itchyny/gojq v0.12.5/go.mod h1:3e1hZXv+Kwvdp6V9HXpVrvddiHVApi5EDZwS+zLFeiE=
github.com/itchyny/timefmt-go v0.1.3 h1:7M3LGVDsqcd0VZH2U+x393obrzZisp7C0uEe921iRkU=
github.com/itchyny/timefmt-go v0.1.3/go.mod h1:0osSSCQSASBJMsIZnhAaF1C2fCBTJZXrnj37mG8/c+A=
github.com/jasonlvhit/gocron v0.0.1 h1:qTt5qF3b3srDjeOIR4Le1LfeyvoYzJlYpqvG7tJX5YU=
@ -451,6 +464,8 @@ github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
github.com/labstack/echo/v4 v4.1.11/go.mod h1:i541M3Fj6f76NZtHSj7TXnyM8n2gaodfvfxNnFqi74g=
github.com/labstack/gommon v0.3.0/go.mod h1:MULnywXg0yavhxWKc+lOruYdAhDwPK9wf0OL7NoOu+k=
github.com/leodido/go-urn v1.2.1 h1:BqpAaACuzVSgi/VLzGZIobT2z4v53pjosyNd9Yv6n/w=
github.com/leodido/go-urn v1.2.1/go.mod h1:zt4jvISO2HfUBqxjfIshjdMTYS56ZS/qv49ictyFfxY=
github.com/lightstep/lightstep-tracer-common/golang/gogo v0.0.0-20190605223551-bc2310a04743/go.mod h1:qklhhLq1aX+mtWk9cPHPzaBjWImj5ULL6C7HFJtXQMM=
github.com/lightstep/lightstep-tracer-go v0.18.1/go.mod h1:jlF1pusYV4pidLvZ+XD0UBX0ZE6WURAspgAczcDHrL4=
github.com/logrusorgru/aurora v0.0.0-20200102142835-e9ef32dff381/go.mod h1:7rIyQOR62GCctdiQpZ/zOJlFyk6y+94wXzv6RNZgaR4=
@ -479,8 +494,6 @@ github.com/mediocregopher/mediocre-go-lib v0.0.0-20181029021733-cb65787f37ed/go.
github.com/mediocregopher/radix/v3 v3.3.0/go.mod h1:EmfVyvspXz1uZEyPBMyGK+kjWiKQGvsUt6O3Pj+LDCQ=
github.com/mgutz/ansi v0.0.0-20170206155736-9520e82c474b/go.mod h1:01TrycV0kFyexm33Z7vhZRXopbI8J3TDReVlkTgMUxE=
github.com/microcosm-cc/bluemonday v1.0.2/go.mod h1:iVP4YcDBq+n/5fb23BhYFvIMq/leAFZyRl6bYmGDlGc=
github.com/microcosm-cc/bluemonday v1.0.15 h1:J4uN+qPng9rvkBZBoBb8YGR+ijuklIMpSOZZLjYpbeY=
github.com/microcosm-cc/bluemonday v1.0.15/go.mod h1:ZLvAzeakRwrGnzQEvstVzVt3ZpqOF2+sdFr0Om+ce30=
github.com/miekg/dns v1.0.14/go.mod h1:W1PPwlIAgtquWBMBEV9nkV9Cazfe8ScdGz/Lj7v3Nrg=
github.com/miekg/dns v1.1.29/go.mod h1:KNUDUusw/aVsxyTYZM1oqvCicbwhgbNgztCETuNZ7xM=
github.com/miekg/dns v1.1.41/go.mod h1:p6aan82bvRIyn+zDIv9xYNUpwa73JcSh9BKwknJysuI=
@ -573,17 +586,18 @@ github.com/projectdiscovery/cryptoutil v0.0.0-20210805184155-b5d2512f9345 h1:jT6
github.com/projectdiscovery/cryptoutil v0.0.0-20210805184155-b5d2512f9345/go.mod h1:clhQmPnt35ziJW1AhJRKyu8aygXCSoyWj6dtmZBRjjc=
github.com/projectdiscovery/fastdialer v0.0.12/go.mod h1:RkRbxqDCcCFhfNUbkzBIz/ieD4uda2JuUA4WJ+RLee0=
github.com/projectdiscovery/fastdialer v0.0.13-0.20210824195254-0113c1406542/go.mod h1:TuapmLiqtunJOxpM7g0tpTy/TUF/0S+XFyx0B0Wx0DQ=
github.com/projectdiscovery/fastdialer v0.0.13-0.20210917073912-cad93d88e69e h1:xMAFYJgRxopAwKrj7HDwMBKJGCGDbHqopS8f959xges=
github.com/projectdiscovery/fastdialer v0.0.13-0.20210917073912-cad93d88e69e/go.mod h1:O1l6+vAQy1QRo9FqyuyJ57W3CwpIXXg7oGo14Le6ZYQ=
github.com/projectdiscovery/fastdialer v0.0.13 h1:BCe7JsFxRk1kAUQcy4X+9lqEuT7Y6LRSlHXfia03XOo=
github.com/projectdiscovery/fastdialer v0.0.13/go.mod h1:Mex24omi3RxrmhA8Ote7rw+6LWMiaBvbJq8CNp0ksII=
github.com/projectdiscovery/filekv v0.0.0-20210915124239-3467ef45dd08 h1:NwD1R/du1dqrRKN3SJl9kT6tN3K9puuWFXEvYF2ihew=
github.com/projectdiscovery/filekv v0.0.0-20210915124239-3467ef45dd08/go.mod h1:paLCnwV8sL7ppqIwVQodQrk3F6mnWafwTDwRd7ywZwQ=
github.com/projectdiscovery/fileutil v0.0.0-20210804142714-ebba15fa53ca/go.mod h1:U+QCpQnX8o2N2w0VUGyAzjM3yBAe4BKedVElxiImsx0=
github.com/projectdiscovery/fileutil v0.0.0-20210914153648-31f843feaad4/go.mod h1:U+QCpQnX8o2N2w0VUGyAzjM3yBAe4BKedVElxiImsx0=
github.com/projectdiscovery/fileutil v0.0.0-20210926202739-6050d0acf73c/go.mod h1:U+QCpQnX8o2N2w0VUGyAzjM3yBAe4BKedVElxiImsx0=
github.com/projectdiscovery/fileutil v0.0.0-20210928100737-cab279c5d4b5 h1:2dbm7UhrAKnccZttr78CAmG768sSCd+MBn4ayLVDeqA=
github.com/projectdiscovery/fileutil v0.0.0-20210928100737-cab279c5d4b5/go.mod h1:U+QCpQnX8o2N2w0VUGyAzjM3yBAe4BKedVElxiImsx0=
github.com/projectdiscovery/goflags v0.0.7/go.mod h1:Jjwsf4eEBPXDSQI2Y+6fd3dBumJv/J1U0nmpM+hy2YY=
github.com/projectdiscovery/goflags v0.0.8-0.20211007103353-9b9229e8a240 h1:b7zDUSsgN5f4/IlhKF6RVGsp/NkHIuty0o1YjzAMKUs=
github.com/projectdiscovery/goflags v0.0.8-0.20211007103353-9b9229e8a240/go.mod h1:Jjwsf4eEBPXDSQI2Y+6fd3dBumJv/J1U0nmpM+hy2YY=
github.com/projectdiscovery/goflags v0.0.8-0.20211028121123-edf02bc05b1a h1:EzwVm8i4zmzqZX55vrDtyfogwHh8AAZ3cWCJe4fEduk=
github.com/projectdiscovery/goflags v0.0.8-0.20211028121123-edf02bc05b1a/go.mod h1:Jjwsf4eEBPXDSQI2Y+6fd3dBumJv/J1U0nmpM+hy2YY=
github.com/projectdiscovery/gologger v1.0.1/go.mod h1:Ok+axMqK53bWNwDSU1nTNwITLYMXMdZtRc8/y1c7sWE=
github.com/projectdiscovery/gologger v1.1.4 h1:qWxGUq7ukHWT849uGPkagPKF3yBPYAsTtMKunQ8O2VI=
github.com/projectdiscovery/gologger v1.1.4/go.mod h1:Bhb6Bdx2PV1nMaFLoXNBmHIU85iROS9y1tBuv7T5pMY=
@ -591,7 +605,6 @@ github.com/projectdiscovery/hmap v0.0.1/go.mod h1:VDEfgzkKQdq7iGTKz8Ooul0NuYHQ8q
github.com/projectdiscovery/hmap v0.0.2-0.20210616215655-7b78e7f33d1f/go.mod h1:FH+MS/WNKTXJQtdRn+/Zg5WlKCiMN0Z1QUedUIuM5n8=
github.com/projectdiscovery/hmap v0.0.2-0.20210727180307-d63d35146e97/go.mod h1:FH+MS/WNKTXJQtdRn+/Zg5WlKCiMN0Z1QUedUIuM5n8=
github.com/projectdiscovery/hmap v0.0.2-0.20210825180603-fca7166c158f/go.mod h1:RLM8b1z2HEq74u5AXN1Lbvfq+1BZWpnTQJcwLnMLA54=
github.com/projectdiscovery/hmap v0.0.2-0.20210917073634-bfb0e9c03800/go.mod h1:FH+MS/WNKTXJQtdRn+/Zg5WlKCiMN0Z1QUedUIuM5n8=
github.com/projectdiscovery/hmap v0.0.2-0.20210917080408-0fd7bd286bfa h1:9sZWFUAshIa/ea0RKjGRuuZiS5PzYXAFjTRUnSbezr0=
github.com/projectdiscovery/hmap v0.0.2-0.20210917080408-0fd7bd286bfa/go.mod h1:lV5f/PNPmCCjCN/dR317/chN9s7VG5h/xcbFfXOz8Fo=
github.com/projectdiscovery/interactsh v0.0.4/go.mod h1:PtJrddeBW1/LeOVgTvvnjUl3Hu/17jTkoIi8rXeEODE=
@ -609,23 +622,23 @@ github.com/projectdiscovery/mapcidr v0.0.8 h1:16U05F2x3o/jSTsxSCY2hCuCs9xOSwVxjo
github.com/projectdiscovery/mapcidr v0.0.8/go.mod h1:7CzdUdjuLVI0s33dQ33lWgjg3vPuLFw2rQzZ0RxkT00=
github.com/projectdiscovery/networkpolicy v0.0.1 h1:RGRuPlxE8WLFF9tdKSjTsYiTIKHNHW20Kl0nGGiRb1I=
github.com/projectdiscovery/networkpolicy v0.0.1/go.mod h1:asvdg5wMy3LPVMGALatebKeOYH5n5fV5RCTv6DbxpIs=
github.com/projectdiscovery/nuclei-updatecheck-api v0.0.0-20210914222811-0a072d262f77 h1:SNtAiRRrJtDJJDroaa/bFXt/Tix2LA6+rHRib0ORlJQ=
github.com/projectdiscovery/nuclei-updatecheck-api v0.0.0-20210914222811-0a072d262f77/go.mod h1:pxWVDgq88t9dWv4+J2AIaWgY+EqOE1AyfHS0Tn23w4M=
github.com/projectdiscovery/nuclei-updatecheck-api v0.0.0-20211006155443-c0a8d610a4df h1:CvTNAUD5JbLMqpMFoGNgfk2gOcN0NC57ICu0+oK84vs=
github.com/projectdiscovery/nuclei-updatecheck-api v0.0.0-20211006155443-c0a8d610a4df/go.mod h1:pxWVDgq88t9dWv4+J2AIaWgY+EqOE1AyfHS0Tn23w4M=
github.com/projectdiscovery/nuclei/v2 v2.5.1/go.mod h1:sU2qcY0MQFS0CqP1BgkR8ZnUyFhqK0BdnY6bvTKNjXY=
github.com/projectdiscovery/rawhttp v0.0.7 h1:5m4peVgjbl7gqDcRYMTVEuX+Xs/nh76ohTkkvufucLg=
github.com/projectdiscovery/rawhttp v0.0.7/go.mod h1:PQERZAhAv7yxI/hR6hdDPgK1WTU56l204BweXrBec+0=
github.com/projectdiscovery/retryabledns v1.0.11/go.mod h1:4sMC8HZyF01HXukRleSQYwz4870bwgb4+hTSXTMrkf4=
github.com/projectdiscovery/retryabledns v1.0.12/go.mod h1:4sMC8HZyF01HXukRleSQYwz4870bwgb4+hTSXTMrkf4=
github.com/projectdiscovery/retryabledns v1.0.13-0.20210916165024-76c5b76fd59a h1:WJQjr9qi/VjWhdNiGyNqcFi0967Gp0W3I769bCpHOJE=
github.com/projectdiscovery/retryabledns v1.0.13-0.20210916165024-76c5b76fd59a/go.mod h1:tXaLDs4n3pRZHwfa8mdXpUWe/AYDNK3HlWDjldhRbjI=
github.com/projectdiscovery/retryabledns v1.0.13-0.20211109182249-43d38df59660 h1:Ooa5htghPkdyfpzy6Y5KLdyv4w8ePZWmfzFSPQlJStQ=
github.com/projectdiscovery/retryabledns v1.0.13-0.20211109182249-43d38df59660/go.mod h1:UfszkO3x+GLKVOpXB7boddJKbwNCr+tMPSkfgCSNhl4=
github.com/projectdiscovery/retryablehttp-go v1.0.1/go.mod h1:SrN6iLZilNG1X4neq1D+SBxoqfAF4nyzvmevkTkWsek=
github.com/projectdiscovery/retryablehttp-go v1.0.2 h1:LV1/KAQU+yeWhNVlvveaYFsjBYRwXlNEq0PvrezMV0U=
github.com/projectdiscovery/retryablehttp-go v1.0.2/go.mod h1:dx//aY9V247qHdsRf0vdWHTBZuBQ2vm6Dq5dagxrDYI=
github.com/projectdiscovery/stringsutil v0.0.0-20210804142656-fd3c28dbaafe/go.mod h1:oTRc18WBv9t6BpaN9XBY+QmG28PUpsyDzRht56Qf49I=
github.com/projectdiscovery/stringsutil v0.0.0-20210823090203-2f5f137e8e1d/go.mod h1:oTRc18WBv9t6BpaN9XBY+QmG28PUpsyDzRht56Qf49I=
github.com/projectdiscovery/stringsutil v0.0.0-20210830151154-f567170afdd9 h1:xbL1/7h0k6HE3RzPdYk9W/8pUxESrGWewTaZdIB5Pes=
github.com/projectdiscovery/stringsutil v0.0.0-20210830151154-f567170afdd9/go.mod h1:oTRc18WBv9t6BpaN9XBY+QmG28PUpsyDzRht56Qf49I=
github.com/projectdiscovery/stringsutil v0.0.0-20211013053023-e7b2e104d80d h1:YBYwsm8MrSp9t7mLehyqGwUKZWB08fG+YRePQRo5iFw=
github.com/projectdiscovery/stringsutil v0.0.0-20211013053023-e7b2e104d80d/go.mod h1:JK4F9ACNPgO+Lbm80khX2q1ABInBMbwIOmbsEE61Sn4=
github.com/projectdiscovery/yamldoc-go v1.0.2 h1:SKb7PHgSOXm27Zci05ba0FxpyQiu6bGEiVMEcjCK1rQ=
github.com/projectdiscovery/yamldoc-go v1.0.2/go.mod h1:7uSxfMXaBmzvw8m5EhOEjB6nhz0rK/H9sUjq1ciZu24=
github.com/prometheus/client_golang v0.9.1/go.mod h1:7SWBe2y4D6OKWSNQJUaRYU/AaXPKyh/dDVn+NZz0KFw=
@ -664,8 +677,6 @@ github.com/russross/blackfriday v1.5.2/go.mod h1:JO/DiYxRf+HjHt06OyowR9PTA263kcR
github.com/russross/blackfriday/v2 v2.0.1/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM=
github.com/ryanuber/columnize v0.0.0-20160712163229-9b3edd62028f/go.mod h1:sm1tb6uqfes/u+d4ooFouqFdy9/2g9QGwK3SQygK0Ts=
github.com/ryanuber/columnize v2.1.0+incompatible/go.mod h1:sm1tb6uqfes/u+d4ooFouqFdy9/2g9QGwK3SQygK0Ts=
github.com/saintfish/chardet v0.0.0-20120816061221-3af4cd4741ca h1:NugYot0LIVPxTvN8n+Kvkn6TrbMyxQiuvKdEwFdR9vI=
github.com/saintfish/chardet v0.0.0-20120816061221-3af4cd4741ca/go.mod h1:uugorj2VCxiV1x+LzaIdVa9b4S4qGAcH6cbhh4qVxOU=
github.com/samuel/go-zookeeper v0.0.0-20190923202752-2cc03de413da/go.mod h1:gi+0XIa01GRL2eRQVjQkKGqKF3SF9vZR/HnPullcV2E=
github.com/sclevine/agouti v3.0.0+incompatible/go.mod h1:b4WX9W9L1sfQKXeJf1mUTLZKJ48R1S7H23Ji7oFO5Bw=
github.com/sean-/seed v0.0.0-20170313163322-e2103e2c3529/go.mod h1:DxrIzT+xaE7yg65j358z/aeFdxmN0P9QXhEzd20vsDc=
@ -674,8 +685,9 @@ github.com/segmentio/ksuid v1.0.4 h1:sBo2BdShXjmcugAMwjugoGUdUV0pcxY5mW4xKRn3v4c
github.com/segmentio/ksuid v1.0.4/go.mod h1:/XUiZBD3kVx5SmUOl55voK5yeAbBNNIed+2O73XgrPE=
github.com/sergi/go-diff v1.0.0/go.mod h1:0CfEIISq7TuYL3j771MWULgwwjU+GofnZX9QAmXWZgo=
github.com/sergi/go-diff v1.1.0/go.mod h1:STckp+ISIX8hZLjrqAeVduY0gWCT9IjLuqbuNXdaHfM=
github.com/shirou/gopsutil/v3 v3.21.7 h1:PnTqQamUjwEDSgn+nBGu0qSDV/CfvyiR/gwTH3i7HTU=
github.com/shirou/gopsutil/v3 v3.21.7/go.mod h1:RGl11Y7XMTQPmHh8F0ayC6haKNBgH4PXMJuTAcMOlz4=
github.com/shirou/gopsutil/v3 v3.21.9 h1:Vn4MUz2uXhqLSiCbGFRc0DILbMVLAY92DSkT8bsYrHg=
github.com/shirou/gopsutil/v3 v3.21.9/go.mod h1:YWp/H8Qs5fVmf17v7JNZzA0mPJ+mS2e9JdiUF9LlKzQ=
github.com/shurcooL/sanitized_anchor_name v1.0.0/go.mod h1:1NzhyTcUVG4SuEtjjoZeVRXNmyL/1OwPU0+IJeTBvfc=
github.com/sirupsen/logrus v1.2.0/go.mod h1:LxeOpSwHxABJmUn/MG1IvRgCAasNZTLOkJPxbbu5VWo=
github.com/sirupsen/logrus v1.4.2/go.mod h1:tLMulIdttU9McNUspp0xgXVQah82FyeX6MwdIuYE2rE=
@ -726,10 +738,12 @@ github.com/tj/go-kinesis v0.0.0-20171128231115-08b17f58cb1b/go.mod h1:/yhzCV0xPf
github.com/tj/go-spin v1.1.0/go.mod h1:Mg1mzmePZm4dva8Qz60H2lHwmJ2loum4VIrLgVnKwh4=
github.com/tj/go-update v2.2.5-0.20200519121640-62b4b798fd68+incompatible h1:guTq1YxwB8XSILkI9q4IrOmrCOS6Hc1L3hmOhi4Swcs=
github.com/tj/go-update v2.2.5-0.20200519121640-62b4b798fd68+incompatible/go.mod h1:waFwwyiAhGey2e+dNoYQ/iLhIcFqhCW7zL/+vDU1WLo=
github.com/tklauser/go-sysconf v0.3.7 h1:HT7h4+536gjqeq1ZIJPgOl1rg1XFatQGVZWp7Py53eg=
github.com/tklauser/go-sysconf v0.3.7/go.mod h1:JZIdXh4RmBvZDBZ41ld2bGxRV3n4daiiqA3skYhAoQ4=
github.com/tklauser/numcpus v0.2.3 h1:nQ0QYpiritP6ViFhrKYsiv6VVxOpum2Gks5GhnJbS/8=
github.com/tklauser/go-sysconf v0.3.9 h1:JeUVdAOWhhxVcU6Eqr/ATFHgXk/mmiItdKeJPev3vTo=
github.com/tklauser/go-sysconf v0.3.9/go.mod h1:11DU/5sG7UexIrp/O6g35hrWzu0JxlwQ3LSFUzyeuhs=
github.com/tklauser/numcpus v0.2.3/go.mod h1:vpEPS/JC+oZGGQ/My/vJnNsvMDQL6PwOqt8dsCw5j+E=
github.com/tklauser/numcpus v0.3.0 h1:ILuRUQBtssgnxw0XXIjKUC56fgnOrFoQQ/4+DeU2biQ=
github.com/tklauser/numcpus v0.3.0/go.mod h1:yFGUr7TUHQRAhyqBcEg0Ge34zDBAsIvJJcyE6boqnA8=
github.com/tmc/grpc-websocket-proxy v0.0.0-20170815181823-89b8d40f7ca8/go.mod h1:ncp9v5uamzpCO7NfCPTXjqaC+bZgJeR0sMTm6dMHP7U=
github.com/trivago/tgo v1.0.7 h1:uaWH/XIy9aWYWpjm2CU3RpcqZXmX2ysQ9/Go+d9gyrM=
github.com/trivago/tgo v1.0.7/go.mod h1:w4dpD+3tzNIIiIfkWWa85w5/B77tlvdZckQ+6PkFnhc=
@ -750,10 +764,13 @@ github.com/valyala/fasttemplate v1.2.1/go.mod h1:KHLXt3tVN2HBp8eijSv/kGJopbvo7S+
github.com/valyala/tcplisten v0.0.0-20161114210144-ceec8f93295a/go.mod h1:v3UYOV9WzVtRmSR+PDvWpU/qWl4Wa5LApYYX4ZtKbio=
github.com/vmihailenco/msgpack/v4 v4.3.12/go.mod h1:gborTTJjAo/GWTqqRjrLCn9pgNN+NXzzngzBKDPIqw4=
github.com/vmihailenco/tagparser v0.1.1/go.mod h1:OeAg3pn3UbLjkWt+rN9oFYB6u/cQgqMEUPoW2WPyhdI=
github.com/weppos/publicsuffix-go v0.15.1-0.20210928183822-5ee35905bd95 h1:DyAZOw3JsVd6LJHqhl4MpKQdYQEmat0C6pPPwom39Ow=
github.com/weppos/publicsuffix-go v0.15.1-0.20210928183822-5ee35905bd95/go.mod h1:HYux0V0Zi04bHNwOHy4cXJVz/TQjYonnF6aoYhj+3QE=
github.com/wsxiaoys/terminal v0.0.0-20160513160801-0940f3fc43a0 h1:3UeQBvD0TFrlVjOeLOBz+CPAI8dnbqNSVwUwRrkp7vQ=
github.com/wsxiaoys/terminal v0.0.0-20160513160801-0940f3fc43a0/go.mod h1:IXCdmsXIht47RaVFLEdVnh1t+pgYtTAhQGj73kz+2DM=
github.com/xanzy/go-gitlab v0.50.3 h1:M7ncgNhCN4jaFNyXxarJhCLa9Qi6fdmCxFFhMTQPZiY=
github.com/xanzy/go-gitlab v0.50.3/go.mod h1:Q+hQhV508bDPoBijv7YjK/Lvlb4PhVhJdKqXVQrUoAE=
github.com/xanzy/go-gitlab v0.51.1 h1:wWKLalwx4omxFoHh3PLs9zDgAD4GXDP/uoxwMRCSiWM=
github.com/xanzy/go-gitlab v0.51.1/go.mod h1:Q+hQhV508bDPoBijv7YjK/Lvlb4PhVhJdKqXVQrUoAE=
github.com/xeipuuv/gojsonpointer v0.0.0-20180127040702-4e3ac2762d5f/go.mod h1:N2zxlSyiKSe5eX1tZViRH5QA0qijqEDrYZiPEAiq3wU=
github.com/xeipuuv/gojsonreference v0.0.0-20180127040603-bd5ef7bd5415/go.mod h1:GwrjFmJcFw6At/Gs6z4yjiIwzuJ1/+UwLxMQDVQXShQ=
github.com/xeipuuv/gojsonschema v1.2.0/go.mod h1:anYRn/JVcOK2ZgGU+IjEV4nwlhoK5sQluxsYJ78Id3Y=
@ -834,6 +851,8 @@ golang.org/x/crypto v0.0.0-20190701094942-4def268fd1a4/go.mod h1:yigFU9vqHzYiE8U
golang.org/x/crypto v0.0.0-20191011191535-87dc89f01550/go.mod h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI=
golang.org/x/crypto v0.0.0-20200622213623-75b288015ac9/go.mod h1:LzIPMQfyMNhhGPhUkYOs5KpL4U8rLKemX1yGLhDgUto=
golang.org/x/crypto v0.0.0-20201112155050-0c6587e931a9/go.mod h1:LzIPMQfyMNhhGPhUkYOs5KpL4U8rLKemX1yGLhDgUto=
golang.org/x/crypto v0.0.0-20210711020723-a769d52b0f97 h1:/UOmuWzQfxxo9UtlXMwuQU8CMgg1eZXqTRwkSQJWKOI=
golang.org/x/crypto v0.0.0-20210711020723-a769d52b0f97/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
golang.org/x/exp v0.0.0-20190121172915-509febef88a4/go.mod h1:CJ0aWSM057203Lf6IL+f9T1iT9GByDxfZKAQTCR3kQA=
golang.org/x/exp v0.0.0-20190306152737-a1d7652674e8/go.mod h1:CJ0aWSM057203Lf6IL+f9T1iT9GByDxfZKAQTCR3kQA=
golang.org/x/exp v0.0.0-20190510132918-efd6b22b2522/go.mod h1:ZjyILWgesfNpC6sMxTJOJm9Kp84zZh5NQWvqDGG3Qr8=
@ -922,16 +941,18 @@ golang.org/x/net v0.0.0-20210521195947-fe42d452be8f/go.mod h1:9nx3DQGgdP8bBQD5qx
golang.org/x/net v0.0.0-20210614182718-04defd469f4e/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.0.0-20210813160813-60bc85c4be6d/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.0.0-20210825183410-e898025ed96a/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.0.0-20210916014120-12bc252f5db8 h1:/6y1LfuqNuQdHAm0jjtPtgRcxIxjVZgm5OTu8/QhZvk=
golang.org/x/net v0.0.0-20210916014120-12bc252f5db8/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.0.0-20211020060615-d418f374d309 h1:A0lJIi+hcTR6aajJH4YqKWwohY4aW9RO7oRMcdv+HKI=
golang.org/x/net v0.0.0-20211020060615-d418f374d309/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/oauth2 v0.0.0-20180821212333-d2e6202438be/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
golang.org/x/oauth2 v0.0.0-20181106182150-f42d05182288/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
golang.org/x/oauth2 v0.0.0-20190226205417-e64efc72b421/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
golang.org/x/oauth2 v0.0.0-20190604053449-0f29369cfe45/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
golang.org/x/oauth2 v0.0.0-20191202225959-858c2ad4c8b6/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
golang.org/x/oauth2 v0.0.0-20200107190931-bf48bf16ab8d/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
golang.org/x/oauth2 v0.0.0-20210817223510-7df4dd6e12ab h1:llrcWN/wOwO+6gAyfBzxb5hZ+c3mriU/0+KNgYu6adA=
golang.org/x/oauth2 v0.0.0-20210817223510-7df4dd6e12ab/go.mod h1:KelEdhl1UZF7XfJ4dDtk6s++YSgaE7mD/BuKKDLBl4A=
golang.org/x/oauth2 v0.0.0-20211005180243-6b3c2da341f1 h1:B333XXssMuKQeBwiNODx4TupZy7bf4sxFZnN2ZOcvUE=
golang.org/x/oauth2 v0.0.0-20211005180243-6b3c2da341f1/go.mod h1:KelEdhl1UZF7XfJ4dDtk6s++YSgaE7mD/BuKKDLBl4A=
golang.org/x/sync v0.0.0-20180314180146-1d60e4601c6f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20181108010431-42b317875d0f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20181221193216-37e7f081c4d4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
@ -995,6 +1016,7 @@ golang.org/x/sys v0.0.0-20200923182605-d9f96fdee20d/go.mod h1:h1NjWce9XRLGQEsW7w
golang.org/x/sys v0.0.0-20200930185726-fdedc70b468f/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20201113233024-12cec1faf1ba/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20201207223542-d4d67f95c62d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210112080510-489259a85091/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210119212857-b64e53b001e4/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210303074136-134d130e1a04/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
@ -1004,9 +1026,13 @@ golang.org/x/sys v0.0.0-20210423082822-04245dca01da/go.mod h1:h1NjWce9XRLGQEsW7w
golang.org/x/sys v0.0.0-20210426230700-d19ff857e887/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210510120138-977fb7262007/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210601080250-7ecdf8ef093b/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210616094352-59db8d763f22/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210630005230-0f9fa26af87c/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210806184541-e5e7981a1069/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210816074244-15123e1e1f71/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210823070655-63515b42dcdf/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210831042530-f4d43177bf5e/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210915083310-ed5796bab164 h1:7ZDGnxgHAMw7thfC5bEos0RDAccZKxioiWBhfIe+tvw=
golang.org/x/sys v0.0.0-20210915083310-ed5796bab164/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=

View File

@ -20,6 +20,6 @@ func showBanner() {
gologger.Print().Msgf("%s\n", banner)
gologger.Print().Msgf("\t\tprojectdiscovery.io\n\n")
gologger.Error().Label("WRN").Msgf("Use with caution. You are responsible for your actions.\n")
gologger.Error().Label("WRN").Msgf("Developers assume no liability and are not responsible for any misuse or damage.\n")
gologger.Print().Label("WRN").Msgf("Use with caution. You are responsible for your actions.\n")
gologger.Print().Label("WRN").Msgf("Developers assume no liability and are not responsible for any misuse or damage.\n")
}

View File

@ -2,12 +2,13 @@ package runner
import (
"bufio"
"errors"
"net/url"
"os"
"path/filepath"
"strings"
"github.com/pkg/errors"
"github.com/go-playground/validator/v10"
"github.com/projectdiscovery/fileutil"
"github.com/projectdiscovery/gologger"
"github.com/projectdiscovery/gologger/formatter"
@ -24,7 +25,6 @@ func ParseOptions(options *types.Options) {
// Read the inputs and configure the logging
configureOutput(options)
// Show the user the banner
showBanner()
@ -47,13 +47,6 @@ func ParseOptions(options *types.Options) {
gologger.Fatal().Msgf("Program exiting: %s\n", err)
}
// Auto adjust rate limits when using headless mode if the user
// hasn't specified any custom limits.
if options.Headless && options.BulkSize == 25 && options.TemplateThreads == 10 {
options.BulkSize = 2
options.TemplateThreads = 2
}
// Load the resolvers if user asked for them
loadResolvers(options)
@ -78,47 +71,47 @@ func hasStdin() bool {
return false
}
isPipedFromChrDev := (stat.Mode() & os.ModeCharDevice) == 0
isPipedFromChrDev := (stat.Mode() & os.ModeCharDevice) != 0
isPipedFromFIFO := (stat.Mode() & os.ModeNamedPipe) != 0
return isPipedFromChrDev || isPipedFromFIFO
}
// validateOptions validates the configuration options passed
func validateOptions(options *types.Options) error {
validate := validator.New()
if err := validate.Struct(options); err != nil {
if _, ok := err.(*validator.InvalidValidationError); ok {
return err
}
errs := []string{}
for _, err := range err.(validator.ValidationErrors) {
errs = append(errs, err.Namespace()+": "+err.Tag())
}
return errors.Wrap(errors.New(strings.Join(errs, ", ")), "validation failed for these fields")
}
if options.Verbose && options.Silent {
return errors.New("both verbose and silent mode specified")
}
if err := validateProxyURL(options.ProxyURL, "invalid http proxy format (It should be http://username:password@host:port)"); err != nil {
// load the proxy server list from file or CLI and test connectivity
if err := loadProxyServers(options); err != nil {
return err
}
if err := validateProxyURL(options.ProxySocksURL, "invalid socks proxy format (It should be socks5://username:password@host:port)"); err != nil {
return err
}
if options.Validate {
options.Headless = true // required for correct validation of headless templates
validateTemplatePaths(options.TemplatesDirectory, options.Templates, options.Workflows)
}
return nil
}
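The `validate.Struct` call above only does work for fields of `types.Options` that carry `validate` struct tags. A self-contained illustration of the go-playground/validator pattern follows; the struct and its tags are invented for the example rather than copied from nuclei.

```go
package main

import (
	"fmt"

	"github.com/go-playground/validator/v10"
)

// exampleOptions shows the idea behind types.Options validation: struct tags declare the constraints.
type exampleOptions struct {
	Timeout  int `validate:"min=1"`          // seconds, must be positive
	BulkSize int `validate:"gte=1,lte=1000"` // parallel hosts per template
}

func main() {
	opts := exampleOptions{Timeout: 0, BulkSize: 25}
	if err := validator.New().Struct(opts); err != nil {
		for _, fieldErr := range err.(validator.ValidationErrors) {
			// Prints e.g. "exampleOptions.Timeout: min", mirroring the Namespace()+Tag() formatting above.
			fmt.Println(fieldErr.Namespace() + ": " + fieldErr.Tag())
		}
	}
}
```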
func validateProxyURL(proxyURL, message string) error {
if proxyURL != "" && !isValidURL(proxyURL) {
return errors.New(message)
// Verify if any of the client certificate options were set since it requires all three to work properly
if len(options.ClientCertFile) > 0 || len(options.ClientKeyFile) > 0 || len(options.ClientCAFile) > 0 {
if len(options.ClientCertFile) == 0 || len(options.ClientKeyFile) == 0 || len(options.ClientCAFile) == 0 {
return errors.New("if a client certification option is provided, then all three must be provided")
}
validateCertificatePaths([]string{options.ClientCertFile, options.ClientKeyFile, options.ClientCAFile})
}
return nil
}
func isValidURL(urlString string) bool {
_, err := url.Parse(urlString)
return err == nil
}
// configureOutput configures the output logging levels to be displayed on the screen
func configureOutput(options *types.Options) {
// If the user desires verbose output, show verbose output
@ -164,7 +157,6 @@ func loadResolvers(options *types.Options) {
func validateTemplatePaths(templatesDirectory string, templatePaths, workflowPaths []string) {
allGivenTemplatePaths := append(templatePaths, workflowPaths...)
for _, templatePath := range allGivenTemplatePaths {
if templatesDirectory != templatePath && filepath.IsAbs(templatePath) {
fileInfo, err := os.Stat(templatePath)
@ -179,3 +171,14 @@ func validateTemplatePaths(templatesDirectory string, templatePaths, workflowPat
}
}
}
func validateCertificatePaths(certificatePaths []string) {
for _, certificatePath := range certificatePaths {
if _, err := os.Stat(certificatePath); os.IsNotExist(err) {
// The provided path to the PEM certificate does not exist for the client authentication. As this is
// required for successful authentication, log a fatal error and stop
gologger.Fatal().Msgf("The given path (%s) to the certificate does not exist!", certificatePath)
break
}
}
}
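For context on why all three paths are required together: client-certificate authentication generally needs the certificate, its private key, and a CA bundle when assembling a tls.Config. The sketch below shows that standard crypto/tls wiring; it is illustrative and not necessarily how nuclei consumes these options internally.

```go
package clienttls

import (
	"crypto/tls"
	"crypto/x509"
	"errors"
	"os"
)

// buildClientTLSConfig loads the certificate/key pair and the CA bundle referenced by the
// -client-cert, -client-key and -client-ca options into a tls.Config for mutual TLS.
func buildClientTLSConfig(certFile, keyFile, caFile string) (*tls.Config, error) {
	cert, err := tls.LoadX509KeyPair(certFile, keyFile)
	if err != nil {
		return nil, err
	}
	caPEM, err := os.ReadFile(caFile)
	if err != nil {
		return nil, err
	}
	pool := x509.NewCertPool()
	if !pool.AppendCertsFromPEM(caPEM) {
		return nil, errors.New("no certificates could be parsed from the CA file")
	}
	return &tls.Config{
		Certificates: []tls.Certificate{cert},
		RootCAs:      pool,
	}, nil
}
```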

View File

@ -1,81 +0,0 @@
package runner
import (
"github.com/projectdiscovery/gologger"
"github.com/projectdiscovery/nuclei/v2/pkg/templates"
"github.com/remeh/sizedwaitgroup"
"go.uber.org/atomic"
)
// processSelfContainedTemplates execute a self-contained template.
func (r *Runner) processSelfContainedTemplates(template *templates.Template) bool {
match, err := template.Executer.Execute("")
if err != nil {
gologger.Warning().Msgf("[%s] Could not execute step: %s\n", r.colorizer.BrightBlue(template.ID), err)
}
return match
}
// processTemplateWithList executes a template against the list of user-provided targets
func (r *Runner) processTemplateWithList(template *templates.Template) bool {
results := &atomic.Bool{}
wg := sizedwaitgroup.New(r.options.BulkSize)
processItem := func(k, _ []byte) error {
URL := string(k)
// Skip if the host has had errors
if r.hostErrors != nil && r.hostErrors.Check(URL) {
return nil
}
wg.Add()
go func(URL string) {
defer wg.Done()
match, err := template.Executer.Execute(URL)
if err != nil {
gologger.Warning().Msgf("[%s] Could not execute step: %s\n", r.colorizer.BrightBlue(template.ID), err)
}
results.CAS(false, match)
}(URL)
return nil
}
if r.options.Stream {
_ = r.hostMapStream.Scan(processItem)
} else {
r.hostMap.Scan(processItem)
}
wg.Wait()
return results.Load()
}
// processWorkflowWithList processes a workflow on the URL list
func (r *Runner) processWorkflowWithList(template *templates.Template) bool {
results := &atomic.Bool{}
wg := sizedwaitgroup.New(r.options.BulkSize)
processItem := func(k, _ []byte) error {
URL := string(k)
// Skip if the host has had errors
if r.hostErrors != nil && r.hostErrors.Check(URL) {
return nil
}
wg.Add()
go func(URL string) {
defer wg.Done()
match := template.CompiledWorkflow.RunWorkflow(URL)
results.CAS(false, match)
}(URL)
return nil
}
if r.options.Stream {
_ = r.hostMapStream.Scan(processItem)
} else {
r.hostMap.Scan(processItem)
}
wg.Wait()
return results.Load()
}

123
v2/internal/runner/proxy.go Normal file

@ -0,0 +1,123 @@
package runner
import (
"bufio"
"errors"
"fmt"
"net"
"net/url"
"os"
"strings"
"time"
"github.com/projectdiscovery/fileutil"
"github.com/projectdiscovery/gologger"
"github.com/projectdiscovery/nuclei/v2/pkg/types"
)
var proxyURLList []url.URL
// loadProxyServers loads the list of proxy servers from a file or comma-separated values
func loadProxyServers(options *types.Options) error {
if len(options.Proxy) == 0 {
return nil
}
for _, p := range options.Proxy {
if proxyURL, err := validateProxyURL(p); err == nil {
proxyURLList = append(proxyURLList, proxyURL)
} else if fileutil.FileExists(p) {
file, err := os.Open(p)
if err != nil {
return fmt.Errorf("could not open proxy file: %s", err)
}
defer file.Close()
scanner := bufio.NewScanner(file)
for scanner.Scan() {
proxy := scanner.Text()
if strings.TrimSpace(proxy) == "" {
continue
}
if proxyURL, err := validateProxyURL(proxy); err != nil {
return err
} else {
proxyURLList = append(proxyURLList, proxyURL)
}
}
} else {
return fmt.Errorf("invalid proxy file or URL provided for %s", p)
}
}
return processProxyList(options)
}
func processProxyList(options *types.Options) error {
if len(proxyURLList) == 0 {
return fmt.Errorf("could not find any valid proxy")
} else {
done := make(chan bool)
exitCounter := make(chan bool)
counter := 0
for _, url := range proxyURLList {
go runProxyConnectivity(url, options, done, exitCounter)
}
for {
select {
case <-done:
{
close(done)
return nil
}
case <-exitCounter:
{
if counter += 1; counter == len(proxyURLList) {
return errors.New("no reachable proxy found")
}
}
}
}
}
}
func runProxyConnectivity(proxyURL url.URL, options *types.Options, done chan bool, exitCounter chan bool) {
if err := testProxyConnection(proxyURL, options.Timeout); err == nil {
if types.ProxyURL == "" && types.ProxySocksURL == "" {
assignProxyURL(proxyURL, options)
done <- true
}
}
exitCounter <- true
}
func testProxyConnection(proxyURL url.URL, timeoutDelay int) error {
timeout := time.Duration(timeoutDelay) * time.Second
_, err := net.DialTimeout("tcp", fmt.Sprintf("%s:%s", proxyURL.Hostname(), proxyURL.Port()), timeout)
if err != nil {
return err
}
return nil
}
func assignProxyURL(proxyURL url.URL, options *types.Options) {
os.Setenv(types.HTTP_PROXY_ENV, proxyURL.String())
if proxyURL.Scheme == types.HTTP || proxyURL.Scheme == types.HTTPS {
types.ProxyURL = proxyURL.String()
types.ProxySocksURL = ""
gologger.Verbose().Msgf("Using %s as proxy server", proxyURL.String())
} else if proxyURL.Scheme == types.SOCKS5 {
types.ProxyURL = ""
types.ProxySocksURL = proxyURL.String()
gologger.Verbose().Msgf("Using %s as socket proxy server", proxyURL.String())
}
}
func validateProxyURL(proxy string) (url.URL, error) {
if url, err := url.Parse(proxy); err == nil && isSupportedProtocol(url.Scheme) {
return *url, nil
}
return url.URL{}, errors.New("invalid proxy format (It should be http[s]/socks5://[username:password@]host:port)")
}
// isSupportedProtocol checks whether the given protocol scheme is supported
func isSupportedProtocol(value string) bool {
return value == types.HTTP || value == types.HTTPS || value == types.SOCKS5
}


@ -2,7 +2,6 @@ package runner
import (
"bufio"
"fmt"
"os"
"path/filepath"
"strings"
@ -10,27 +9,21 @@ import (
"github.com/logrusorgru/aurora"
"github.com/pkg/errors"
"github.com/remeh/sizedwaitgroup"
"github.com/rs/xid"
"go.uber.org/atomic"
"go.uber.org/ratelimit"
"gopkg.in/yaml.v2"
"github.com/projectdiscovery/filekv"
"github.com/projectdiscovery/fileutil"
"github.com/projectdiscovery/gologger"
"github.com/projectdiscovery/hmap/store/hybrid"
"github.com/projectdiscovery/nuclei/v2/internal/colorizer"
"github.com/projectdiscovery/nuclei/v2/pkg/catalog"
"github.com/projectdiscovery/nuclei/v2/pkg/catalog/config"
"github.com/projectdiscovery/nuclei/v2/pkg/catalog/loader"
"github.com/projectdiscovery/nuclei/v2/pkg/core"
"github.com/projectdiscovery/nuclei/v2/pkg/core/inputs/hybrid"
"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
"github.com/projectdiscovery/nuclei/v2/pkg/output"
"github.com/projectdiscovery/nuclei/v2/pkg/parsers"
"github.com/projectdiscovery/nuclei/v2/pkg/progress"
"github.com/projectdiscovery/nuclei/v2/pkg/projectfile"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/clusterer"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/hosterrorscache"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/interactsh"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/protocolinit"
@ -42,15 +35,13 @@ import (
"github.com/projectdiscovery/nuclei/v2/pkg/types"
"github.com/projectdiscovery/nuclei/v2/pkg/utils"
"github.com/projectdiscovery/nuclei/v2/pkg/utils/stats"
yamlwrapper "github.com/projectdiscovery/nuclei/v2/pkg/utils/yaml"
)
// Runner is a client for running the enumeration process.
type Runner struct {
hostMap *hybrid.HybridMap
hostMapStream *filekv.FileDB
output output.Writer
interactsh *interactsh.Client
inputCount int64
templatesConfig *config.Config
options *types.Options
projectFile *projectfile.ProjectFile
@ -59,6 +50,7 @@ type Runner struct {
colorizer aurora.Aurora
issuesClient *reporting.Client
addColor func(severity.Severity) string
hmapInputProvider *hybrid.Input
browser *engine.Browser
ratelimiter ratelimit.Limiter
hostErrors *hosterrorscache.Cache
@ -116,106 +108,16 @@ func New(options *types.Options) (*Runner, error) {
if (len(options.Templates) == 0 || !options.NewTemplates || (options.TargetsFilePath == "" && !options.Stdin && len(options.Targets) == 0)) && options.UpdateTemplates {
os.Exit(0)
}
hm, err := hybrid.New(hybrid.DefaultDiskOptions)
// Initialize the input source
hmapInput, err := hybrid.New(options)
if err != nil {
return nil, errors.Wrap(err, "could not create temporary input file")
}
runner.hostMap = hm
if options.Stream {
fkvOptions := filekv.DefaultOptions
if tmpFileName, err := fileutil.GetTempFileName(); err != nil {
return nil, errors.Wrap(err, "could not create temporary input file")
} else {
fkvOptions.Path = tmpFileName
}
fkv, err := filekv.Open(fkvOptions)
if err != nil {
return nil, errors.Wrap(err, "could not create temporary unsorted input file")
}
runner.hostMapStream = fkv
}
runner.inputCount = 0
dupeCount := 0
// Handle multiple targets
if len(options.Targets) != 0 {
for _, target := range options.Targets {
url := strings.TrimSpace(target)
if url == "" {
continue
}
if _, ok := runner.hostMap.Get(url); ok {
dupeCount++
continue
}
runner.inputCount++
// nolint:errcheck // ignoring error
runner.hostMap.Set(url, nil)
if options.Stream {
_ = runner.hostMapStream.Set([]byte(url), nil)
}
}
}
// Handle stdin
if options.Stdin {
scanner := bufio.NewScanner(os.Stdin)
for scanner.Scan() {
url := strings.TrimSpace(scanner.Text())
if url == "" {
continue
}
if _, ok := runner.hostMap.Get(url); ok {
dupeCount++
continue
}
runner.inputCount++
// nolint:errcheck // ignoring error
runner.hostMap.Set(url, nil)
if options.Stream {
_ = runner.hostMapStream.Set([]byte(url), nil)
}
}
}
// Handle target file
if options.TargetsFilePath != "" {
input, inputErr := os.Open(options.TargetsFilePath)
if inputErr != nil {
return nil, errors.Wrap(inputErr, "could not open targets file")
}
scanner := bufio.NewScanner(input)
for scanner.Scan() {
url := strings.TrimSpace(scanner.Text())
if url == "" {
continue
}
if _, ok := runner.hostMap.Get(url); ok {
dupeCount++
continue
}
runner.inputCount++
// nolint:errcheck // ignoring error
runner.hostMap.Set(url, nil)
if options.Stream {
_ = runner.hostMapStream.Set([]byte(url), nil)
}
}
input.Close()
}
if dupeCount > 0 {
gologger.Info().Msgf("Supplied input was automatically deduplicated (%d removed).", dupeCount)
return nil, errors.Wrap(err, "could not create input provider")
}
runner.hmapInputProvider = hmapInput
// Create the output file if asked
outputWriter, err := output.NewStandardWriter(!options.NoColor, options.NoMeta, options.NoTimestamp, options.JSON, options.JSONRequests, options.Output, options.TraceLogFile)
outputWriter, err := output.NewStandardWriter(!options.NoColor, options.NoMeta, options.NoTimestamp, options.JSON, options.JSONRequests, options.MatcherStatus, options.Output, options.TraceLogFile, options.ErrorLogFile)
if err != nil {
return nil, errors.Wrap(err, "could not create output file")
}
@ -243,25 +145,22 @@ func New(options *types.Options) (*Runner, error) {
}
}
if !options.NoInteractsh {
interactshClient, err := interactsh.New(&interactsh.Options{
ServerURL: options.InteractshURL,
Authorization: options.InteractshToken,
CacheSize: int64(options.InteractionsCacheSize),
Eviction: time.Duration(options.InteractionsEviction) * time.Second,
ColldownPeriod: time.Duration(options.InteractionsColldownPeriod) * time.Second,
PollDuration: time.Duration(options.InteractionsPollDuration) * time.Second,
Output: runner.output,
IssuesClient: runner.issuesClient,
Progress: runner.progress,
Debug: runner.options.Debug,
})
opts := interactsh.NewDefaultOptions(runner.output, runner.issuesClient, runner.progress)
opts.Debug = runner.options.Debug
opts.ServerURL = options.InteractshURL
opts.Authorization = options.InteractshToken
opts.CacheSize = int64(options.InteractionsCacheSize)
opts.Eviction = time.Duration(options.InteractionsEviction) * time.Second
opts.ColldownPeriod = time.Duration(options.InteractionsCooldownPeriod) * time.Second
opts.PollDuration = time.Duration(options.InteractionsPollDuration) * time.Second
opts.NoInteractsh = runner.options.NoInteractsh
interactshClient, err := interactsh.New(opts)
if err != nil {
gologger.Error().Msgf("Could not create interactsh client: %s", err)
} else {
runner.interactsh = interactshClient
}
}
if options.RateLimitMinute > 0 {
runner.ratelimiter = ratelimit.New(options.RateLimitMinute, ratelimit.Per(60*time.Second))
@ -282,9 +181,9 @@ func createReportingOptions(options *types.Options) (*reporting.Options, error)
}
reportingOptions = &reporting.Options{}
if parseErr := yaml.NewDecoder(file).Decode(reportingOptions); parseErr != nil {
if err := yamlwrapper.DecodeAndValidate(file, reportingOptions); err != nil {
file.Close()
return nil, errors.Wrap(parseErr, "could not parse reporting config file")
return nil, errors.Wrap(err, "could not parse reporting config file")
}
file.Close()
}
@ -312,13 +211,10 @@ func (r *Runner) Close() {
if r.output != nil {
r.output.Close()
}
r.hostMap.Close()
if r.projectFile != nil {
r.projectFile.Close()
}
if r.options.Stream {
r.hostMapStream.Close()
}
r.hmapInputProvider.Close()
protocolinit.Close()
}
@ -344,6 +240,9 @@ func (r *Runner) RunEnumeration() error {
cache = hosterrorscache.New(r.options.MaxHostError, hosterrorscache.DefaultMaxHostsCount).SetVerbose(r.options.Verbose)
}
r.hostErrors = cache
// Create the executer options which will be used throughout the execution
// stage by the nuclei engine modules.
executerOpts := protocols.ExecuterOptions{
Output: r.output,
Options: r.options,
@ -355,31 +254,18 @@ func (r *Runner) RunEnumeration() error {
ProjectFile: r.projectFile,
Browser: r.browser,
HostErrorsCache: cache,
Colorizer: r.colorizer,
}
engine := core.New(r.options)
engine.SetExecuterOptions(executerOpts)
workflowLoader, err := parsers.NewLoader(&executerOpts)
if err != nil {
return errors.Wrap(err, "Could not create loader.")
}
executerOpts.WorkflowLoader = workflowLoader
loaderConfig := loader.Config{
Templates: r.options.Templates,
Workflows: r.options.Workflows,
ExcludeTemplates: r.options.ExcludedTemplates,
Tags: r.options.Tags,
ExcludeTags: r.options.ExcludeTags,
IncludeTemplates: r.options.IncludeTemplates,
Authors: r.options.Author,
Severities: r.options.Severities,
ExcludeSeverities: r.options.ExcludeSeverities,
IncludeTags: r.options.IncludeTags,
TemplatesDirectory: r.options.TemplatesDirectory,
Catalog: r.catalog,
ExecutorOptions: executerOpts,
}
store, err := loader.New(&loaderConfig)
store, err := loader.New(loader.NewConfig(r.options, r.catalog, executerOpts))
if err != nil {
return errors.Wrap(err, "could not load templates from config")
}
@ -397,6 +283,78 @@ func (r *Runner) RunEnumeration() error {
return nil // exit
}
r.displayExecutionInfo(store)
var unclusteredRequests int64
for _, template := range store.Templates() {
// workflows will dynamically adjust the totals while running, as
// it can't be known in advance which requests will be called
if len(template.Workflows) > 0 {
continue
}
unclusteredRequests += int64(template.TotalRequests) * r.hmapInputProvider.Count()
}
if r.options.VerboseVerbose {
for _, template := range store.Templates() {
r.logAvailableTemplate(template.Path)
}
for _, template := range store.Workflows() {
r.logAvailableTemplate(template.Path)
}
}
// Cluster the templates first because we want info on how many
// templates were clustered, for display to the user in the CLI
originalTemplatesCount := len(store.Templates())
finalTemplates, clusterCount := templates.ClusterTemplates(store.Templates(), engine.ExecuterOptions())
finalTemplates = append(finalTemplates, store.Workflows()...)
var totalRequests int64
for _, t := range finalTemplates {
if len(t.Workflows) > 0 {
continue
}
totalRequests += int64(t.TotalRequests) * r.hmapInputProvider.Count()
}
if totalRequests < unclusteredRequests {
gologger.Info().Msgf("Templates clustered: %d (Reduced %d HTTP Requests)", clusterCount, unclusteredRequests-totalRequests)
}
workflowCount := len(store.Workflows())
templateCount := originalTemplatesCount + workflowCount
// 0 matches means no templates were found in directory
if templateCount == 0 {
return errors.New("no valid templates were found")
}
// tracks global progress and captures stdout/stderr until p.Wait finishes
r.progress.Init(r.hmapInputProvider.Count(), templateCount, totalRequests)
results := engine.ExecuteWithOpts(finalTemplates, r.hmapInputProvider, true)
if r.interactsh != nil {
matched := r.interactsh.Close()
if matched {
results.CAS(false, true)
}
}
r.progress.Stop()
if r.issuesClient != nil {
r.issuesClient.Close()
}
if !results.Load() {
gologger.Info().Msgf("No results found. Better luck next time!")
}
if r.browser != nil {
r.browser.Close()
}
return nil
}
// displayExecutionInfo displays misc info about the nuclei engine execution
func (r *Runner) displayExecutionInfo(store *loader.Store) {
// Display stats for any loaded templates' syntax warnings or errors
stats.Display(parsers.SyntaxWarningStats)
stats.Display(parsers.SyntaxErrorStats)
@ -445,128 +403,6 @@ func (r *Runner) RunEnumeration() error {
if len(store.Workflows()) > 0 {
gologger.Info().Msgf("Workflows loaded for scan: %d", len(store.Workflows()))
}
// pre-parse all the templates, apply filters
finalTemplates := []*templates.Template{}
var unclusteredRequests int64
for _, template := range store.Templates() {
// workflows will dynamically adjust the totals while running, as
// it can't be known in advance which requests will be called
if len(template.Workflows) > 0 {
continue
}
unclusteredRequests += int64(template.TotalRequests) * r.inputCount
}
if r.options.VerboseVerbose {
for _, template := range store.Templates() {
r.logAvailableTemplate(template.Path)
}
for _, template := range store.Workflows() {
r.logAvailableTemplate(template.Path)
}
}
templatesMap := make(map[string]*templates.Template)
for _, v := range store.Templates() {
templatesMap[v.Path] = v
}
originalTemplatesCount := len(store.Templates())
clusterCount := 0
clusters := clusterer.Cluster(templatesMap)
for _, cluster := range clusters {
if len(cluster) > 1 && !r.options.OfflineHTTP {
executerOpts := protocols.ExecuterOptions{
Output: r.output,
Options: r.options,
Progress: r.progress,
Catalog: r.catalog,
RateLimiter: r.ratelimiter,
IssuesClient: r.issuesClient,
Browser: r.browser,
ProjectFile: r.projectFile,
Interactsh: r.interactsh,
HostErrorsCache: cache,
}
clusterID := fmt.Sprintf("cluster-%s", xid.New().String())
finalTemplates = append(finalTemplates, &templates.Template{
ID: clusterID,
RequestsHTTP: cluster[0].RequestsHTTP,
Executer: clusterer.NewExecuter(cluster, &executerOpts),
TotalRequests: len(cluster[0].RequestsHTTP),
})
clusterCount += len(cluster)
} else {
finalTemplates = append(finalTemplates, cluster...)
}
}
finalTemplates = append(finalTemplates, store.Workflows()...)
var totalRequests int64
for _, t := range finalTemplates {
if len(t.Workflows) > 0 {
continue
}
totalRequests += int64(t.TotalRequests) * r.inputCount
}
if totalRequests < unclusteredRequests {
gologger.Info().Msgf("Templates clustered: %d (Reduced %d HTTP Requests)", clusterCount, unclusteredRequests-totalRequests)
}
workflowCount := len(store.Workflows())
templateCount := originalTemplatesCount + workflowCount
// 0 matches means no templates were found in directory
if templateCount == 0 {
return errors.New("no valid templates were found")
}
/*
TODO does it make sense to run the logic below if there are no targets specified?
Can we safely assume the user is just experimenting with the template/workflow filters before running them?
*/
results := &atomic.Bool{}
wgtemplates := sizedwaitgroup.New(r.options.TemplateThreads)
// tracks global progress and captures stdout/stderr until p.Wait finishes
r.progress.Init(r.inputCount, templateCount, totalRequests)
for _, t := range finalTemplates {
wgtemplates.Add()
go func(template *templates.Template) {
defer wgtemplates.Done()
if template.SelfContained {
results.CAS(false, r.processSelfContainedTemplates(template))
} else if len(template.Workflows) > 0 {
results.CAS(false, r.processWorkflowWithList(template))
} else {
results.CAS(false, r.processTemplateWithList(template))
}
}(t)
}
wgtemplates.Wait()
if r.interactsh != nil {
matched := r.interactsh.Close()
if matched {
results.CAS(false, true)
}
}
r.progress.Stop()
if r.issuesClient != nil {
r.issuesClient.Close()
}
if !results.Load() {
gologger.Info().Msgf("No results found. Better luck next time!")
}
if r.browser != nil {
r.browser.Close()
}
return nil
}
// readNewTemplatesFile reads newly added templates from directory if it exists


@ -13,8 +13,8 @@ import (
"testing"
"github.com/projectdiscovery/gologger"
"github.com/projectdiscovery/nuclei/v2/internal/testutils"
"github.com/projectdiscovery/nuclei/v2/pkg/catalog/config"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
"github.com/stretchr/testify/require"
)


@ -26,7 +26,7 @@ type Config struct {
const nucleiConfigFilename = ".templates-config.json"
// Version is the current version of nuclei
const Version = `2.5.3`
const Version = `2.5.4-dev`
func getConfigDetails() (string, error) {
homeDir, err := os.UserHomeDir()


@ -5,6 +5,7 @@ import (
"strings"
"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
"github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
)
// TagFilter is used to filter nuclei templates for tag based execution
@ -15,6 +16,8 @@ type TagFilter struct {
authors map[string]struct{}
block map[string]struct{}
matchAllows map[string]struct{}
types map[types.ProtocolType]struct{}
excludeTypes map[types.ProtocolType]struct{}
}
// ErrExcluded is returned for excluded templates
@ -25,7 +28,7 @@ var ErrExcluded = errors.New("the template was excluded")
// unless it is explicitly specified by user using the includeTags (matchAllows field).
// Matching rule: (tag1 OR tag2...) AND (author1 OR author2...) AND (severity1 OR severity2...) AND (extraTags1 OR extraTags2...)
// Returns true if the template matches the filter criteria, false otherwise.
func (tagFilter *TagFilter) Match(templateTags, templateAuthors []string, templateSeverity severity.Severity, extraTags []string) (bool, error) {
func (tagFilter *TagFilter) Match(templateTags, templateAuthors []string, templateSeverity severity.Severity, extraTags []string, templateType types.ProtocolType) (bool, error) {
for _, templateTag := range templateTags {
_, blocked := tagFilter.block[templateTag]
_, allowed := tagFilter.matchAllows[templateTag]
@ -51,6 +54,9 @@ func (tagFilter *TagFilter) Match(templateTags, templateAuthors []string, templa
return false, nil
}
if !isTemplateTypeMatch(tagFilter, templateType) {
return false, nil
}
return true, nil
}
@ -116,6 +122,27 @@ func isTagMatch(tagFilter *TagFilter, templateTags []string) bool {
return false
}
func isTemplateTypeMatch(tagFilter *TagFilter, templateType types.ProtocolType) bool {
if len(tagFilter.excludeTypes) == 0 && len(tagFilter.types) == 0 {
return true
}
if templateType.String() == "" || templateType == types.InvalidProtocol {
return true
}
included := true
if len(tagFilter.types) > 0 {
_, included = tagFilter.types[templateType]
}
excluded := false
if len(tagFilter.excludeTypes) > 0 {
_, excluded = tagFilter.excludeTypes[templateType]
}
return included && !excluded
}
type Config struct {
Tags []string
ExcludeTags []string
@ -123,6 +150,8 @@ type Config struct {
Severities severity.Severities
ExcludeSeverities severity.Severities
IncludeTags []string
Protocols types.ProtocolTypes
ExcludeProtocols types.ProtocolTypes
}
// New returns a tag filter for nuclei tag based execution
@ -136,6 +165,8 @@ func New(config *Config) *TagFilter {
excludeSeverities: make(map[severity.Severity]struct{}),
block: make(map[string]struct{}),
matchAllows: make(map[string]struct{}),
types: make(map[types.ProtocolType]struct{}),
excludeTypes: make(map[types.ProtocolType]struct{}),
}
for _, tag := range config.ExcludeTags {
for _, val := range splitCommaTrim(tag) {
@ -177,6 +208,16 @@ func New(config *Config) *TagFilter {
delete(filter.block, val)
}
}
for _, tag := range config.Protocols {
if _, ok := filter.types[tag]; !ok {
filter.types[tag] = struct{}{}
}
}
for _, tag := range config.ExcludeProtocols {
if _, ok := filter.excludeTypes[tag]; !ok {
filter.excludeTypes[tag] = struct{}{}
}
}
return filter
}


@ -6,6 +6,7 @@ import (
"github.com/stretchr/testify/require"
"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
"github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
)
func TestTagBasedFilter(t *testing.T) {
@ -15,19 +16,19 @@ func TestTagBasedFilter(t *testing.T) {
})
t.Run("true", func(t *testing.T) {
matched, _ := filter.Match([]string{"jira"}, []string{"pdteam"}, severity.Low, nil)
matched, _ := filter.Match([]string{"jira"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
require.True(t, matched, "could not get correct match")
})
t.Run("false", func(t *testing.T) {
matched, _ := filter.Match([]string{"consul"}, []string{"pdteam"}, severity.Low, nil)
matched, _ := filter.Match([]string{"consul"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
require.False(t, matched, "could not get correct match")
})
t.Run("match-extra-tags-positive", func(t *testing.T) {
matched, _ := filter.Match([]string{"cves", "vuln"}, []string{"pdteam"}, severity.Low, []string{"vuln"})
matched, _ := filter.Match([]string{"cves", "vuln"}, []string{"pdteam"}, severity.Low, []string{"vuln"}, types.HTTPProtocol)
require.True(t, matched, "could not get correct match")
})
t.Run("match-extra-tags-negative", func(t *testing.T) {
matched, _ := filter.Match([]string{"cves"}, []string{"pdteam"}, severity.Low, []string{"vuln"})
matched, _ := filter.Match([]string{"cves"}, []string{"pdteam"}, severity.Low, []string{"vuln"}, types.HTTPProtocol)
require.False(t, matched, "could not get correct match")
})
}
@ -36,7 +37,7 @@ func TestTagBasedFilter(t *testing.T) {
filter := New(&Config{
ExcludeTags: []string{"dos"},
})
matched, err := filter.Match([]string{"dos"}, []string{"pdteam"}, severity.Low, nil)
matched, err := filter.Match([]string{"dos"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
require.False(t, matched, "could not get correct match")
require.Equal(t, ErrExcluded, err, "could not get correct error")
})
@ -46,7 +47,7 @@ func TestTagBasedFilter(t *testing.T) {
ExcludeTags: []string{"dos", "fuzz"},
IncludeTags: []string{"fuzz"},
})
matched, err := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil)
matched, err := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
require.Nil(t, err, "could not get match")
require.True(t, matched, "could not get correct match")
})
@ -55,7 +56,7 @@ func TestTagBasedFilter(t *testing.T) {
Tags: []string{"fuzz"},
ExcludeTags: []string{"fuzz"},
})
matched, err := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil)
matched, err := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
require.Nil(t, err, "could not get match")
require.True(t, matched, "could not get correct match")
})
@ -63,24 +64,24 @@ func TestTagBasedFilter(t *testing.T) {
filter := New(&Config{
Authors: []string{"pdteam"},
})
matched, _ := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil)
matched, _ := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
require.True(t, matched, "could not get correct match")
})
t.Run("match-severity", func(t *testing.T) {
filter := New(&Config{
Severities: severity.Severities{severity.High},
})
matched, _ := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.High, nil)
matched, _ := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.High, nil, types.HTTPProtocol)
require.True(t, matched, "could not get correct match")
})
t.Run("match-exclude-severity", func(t *testing.T) {
filter := New(&Config{
ExcludeSeverities: severity.Severities{severity.Low},
})
matched, _ := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.High, nil)
matched, _ := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.High, nil, types.HTTPProtocol)
require.True(t, matched, "could not get correct match")
matched, _ = filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil)
matched, _ = filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
require.False(t, matched, "could not get correct match")
})
t.Run("match-exclude-with-tags", func(t *testing.T) {
@ -88,7 +89,7 @@ func TestTagBasedFilter(t *testing.T) {
Tags: []string{"tag"},
ExcludeTags: []string{"another"},
})
matched, _ := filter.Match([]string{"another"}, []string{"pdteam"}, severity.High, nil)
matched, _ := filter.Match([]string{"another"}, []string{"pdteam"}, severity.High, nil, types.HTTPProtocol)
require.False(t, matched, "could not get correct match")
})
t.Run("match-conditions", func(t *testing.T) {
@ -97,16 +98,33 @@ func TestTagBasedFilter(t *testing.T) {
Tags: []string{"jira"},
Severities: severity.Severities{severity.High},
})
matched, _ := filter.Match([]string{"jira", "cve"}, []string{"pdteam", "someOtherUser"}, severity.High, nil)
matched, _ := filter.Match([]string{"jira", "cve"}, []string{"pdteam", "someOtherUser"}, severity.High, nil, types.HTTPProtocol)
require.True(t, matched, "could not get correct match")
matched, _ = filter.Match([]string{"jira"}, []string{"pdteam"}, severity.Low, nil)
matched, _ = filter.Match([]string{"jira"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
require.False(t, matched, "could not get correct match")
matched, _ = filter.Match([]string{"jira"}, []string{"random"}, severity.Low, nil)
matched, _ = filter.Match([]string{"jira"}, []string{"random"}, severity.Low, nil, types.HTTPProtocol)
require.False(t, matched, "could not get correct match")
matched, _ = filter.Match([]string{"consul"}, []string{"random"}, severity.Low, nil)
matched, _ = filter.Match([]string{"consul"}, []string{"random"}, severity.Low, nil, types.HTTPProtocol)
require.False(t, matched, "could not get correct match")
})
t.Run("match-type", func(t *testing.T) {
filter := New(&Config{
Protocols: []types.ProtocolType{types.HTTPProtocol},
})
matched, _ := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.High, nil, types.HTTPProtocol)
require.True(t, matched, "could not get correct match")
})
t.Run("match-exclude-type", func(t *testing.T) {
filter := New(&Config{
ExcludeProtocols: []types.ProtocolType{types.HTTPProtocol},
})
matched, _ := filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.High, nil, types.DNSProtocol)
require.True(t, matched, "could not get correct match")
matched, _ = filter.Match([]string{"fuzz"}, []string{"pdteam"}, severity.Low, nil, types.HTTPProtocol)
require.False(t, matched, "could not get correct match")
})
}


@ -10,17 +10,23 @@ import (
"github.com/projectdiscovery/nuclei/v2/pkg/parsers"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
"github.com/projectdiscovery/nuclei/v2/pkg/templates"
templateTypes "github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
"github.com/projectdiscovery/nuclei/v2/pkg/types"
)
// Config contains the configuration options for the loader
type Config struct {
Templates []string
TemplateURLs []string
Workflows []string
WorkflowURLs []string
ExcludeTemplates []string
IncludeTemplates []string
Tags []string
ExcludeTags []string
Protocols templateTypes.ProtocolTypes
ExcludeProtocols templateTypes.ProtocolTypes
Authors []string
Severities severity.Severities
ExcludeSeverities severity.Severities
@ -37,6 +43,7 @@ type Store struct {
pathFilter *filter.PathFilter
config *Config
finalTemplates []string
finalWorkflows []string
templates []*templates.Template
workflows []*templates.Template
@ -44,6 +51,30 @@ type Store struct {
preprocessor templates.Preprocessor
}
// NewConfig returns a new loader config
func NewConfig(options *types.Options, catalog *catalog.Catalog, executerOpts protocols.ExecuterOptions) *Config {
loaderConfig := Config{
Templates: options.Templates,
Workflows: options.Workflows,
TemplateURLs: options.TemplateURLs,
WorkflowURLs: options.WorkflowURLs,
ExcludeTemplates: options.ExcludedTemplates,
Tags: options.Tags,
ExcludeTags: options.ExcludeTags,
IncludeTemplates: options.IncludeTemplates,
Authors: options.Authors,
Severities: options.Severities,
ExcludeSeverities: options.ExcludeSeverities,
IncludeTags: options.IncludeTags,
TemplatesDirectory: options.TemplatesDirectory,
Protocols: options.Protocols,
ExcludeProtocols: options.ExcludeProtocols,
Catalog: catalog,
ExecutorOptions: executerOpts,
}
return &loaderConfig
}
// New creates a new template store based on provided configuration
func New(config *Config) (*Store, error) {
// Create a tag filter based on provided configuration
@ -56,18 +87,32 @@ func New(config *Config) (*Store, error) {
Severities: config.Severities,
ExcludeSeverities: config.ExcludeSeverities,
IncludeTags: config.IncludeTags,
Protocols: config.Protocols,
ExcludeProtocols: config.ExcludeProtocols,
}),
pathFilter: filter.NewPathFilter(&filter.PathFilterConfig{
IncludedTemplates: config.IncludeTemplates,
ExcludedTemplates: config.ExcludeTemplates,
}, config.Catalog),
finalTemplates: config.Templates,
finalWorkflows: config.Workflows,
}
urlbasedTemplatesProvided := len(config.TemplateURLs) > 0 || len(config.WorkflowURLs) > 0
if urlbasedTemplatesProvided {
remoteTemplates, remoteWorkflows, err := getRemoteTemplatesAndWorkflows(config.TemplateURLs, config.WorkflowURLs)
if err != nil {
return store, err
}
store.finalTemplates = append(store.finalTemplates, remoteTemplates...)
store.finalWorkflows = append(store.finalWorkflows, remoteWorkflows...)
}
// Handle a case with no templates or workflows, where we use base directory
if len(config.Templates) == 0 && len(config.Workflows) == 0 {
config.Templates = append(config.Templates, config.TemplatesDirectory)
if len(store.finalTemplates) == 0 && len(store.finalWorkflows) == 0 && !urlbasedTemplatesProvided {
store.finalTemplates = []string{config.TemplatesDirectory}
}
store.finalTemplates = append(store.finalTemplates, config.Templates...)
return store, nil
}
@ -90,7 +135,7 @@ func (store *Store) RegisterPreprocessor(preprocessor templates.Preprocessor) {
// the complete compiled templates for a nuclei execution configuration.
func (store *Store) Load() {
store.templates = store.LoadTemplates(store.finalTemplates)
store.workflows = store.LoadWorkflows(store.config.Workflows)
store.workflows = store.LoadWorkflows(store.finalWorkflows)
}
// ValidateTemplates takes a list of templates and validates them


@ -0,0 +1,95 @@
package loader
import (
"bufio"
"fmt"
"net/http"
"strings"
"github.com/pkg/errors"
)
type ContentType string
const (
Template ContentType = "Template"
Workflow ContentType = "Workflow"
)
type RemoteContentError struct {
Content []string
Type ContentType
Error error
}
func getRemoteTemplatesAndWorkflows(templateURLs []string, workflowURLs []string) ([]string, []string, error) {
remoteContentErrorChannel := make(chan RemoteContentError)
for _, templateURL := range templateURLs {
go getRemoteContent(templateURL, remoteContentErrorChannel, Template)
}
for _, workflowURL := range workflowURLs {
go getRemoteContent(workflowURL, remoteContentErrorChannel, Workflow)
}
var remoteTemplateList []string
var remoteWorkFlowList []string
var err error
for i := 0; i < (len(templateURLs) + len(workflowURLs)); i++ {
remoteContentError := <-remoteContentErrorChannel
if remoteContentError.Error != nil {
if err != nil {
err = errors.New(remoteContentError.Error.Error() + ": " + err.Error())
} else {
err = remoteContentError.Error
}
} else {
if remoteContentError.Type == Template {
remoteTemplateList = append(remoteTemplateList, remoteContentError.Content...)
} else if remoteContentError.Type == Workflow {
remoteWorkFlowList = append(remoteWorkFlowList, remoteContentError.Content...)
}
}
}
return remoteTemplateList, remoteWorkFlowList, err
}
func getRemoteContent(URL string, w chan<- RemoteContentError, contentType ContentType) {
response, err := http.Get(URL)
if err != nil {
w <- RemoteContentError{
Error: err,
}
return
}
defer response.Body.Close()
if response.StatusCode < 200 || response.StatusCode > 299 {
w <- RemoteContentError{
Error: fmt.Errorf("get \"%s\": unexpect status %d", URL, response.StatusCode),
}
return
}
scanner := bufio.NewScanner(response.Body)
var templateList []string
for scanner.Scan() {
text := strings.TrimSpace(scanner.Text())
if text == "" {
continue
}
templateList = append(templateList, text)
}
if err := scanner.Err(); err != nil {
w <- RemoteContentError{
Error: errors.Wrapf(err, "get \"%s\"", URL),
}
return
}
w <- RemoteContentError{
Content: templateList,
Type: contentType,
}
}
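Tying the remote fetcher back into the loader: a short sketch of loading templates from local paths plus a remote list. The URL is a placeholder, and the options, catalog and executer options are assumed to be prepared elsewhere, as the runner does in RunEnumeration:

```go
package main

import (
	"github.com/pkg/errors"
	"github.com/projectdiscovery/gologger"
	"github.com/projectdiscovery/nuclei/v2/pkg/catalog"
	"github.com/projectdiscovery/nuclei/v2/pkg/catalog/loader"
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
	"github.com/projectdiscovery/nuclei/v2/pkg/types"
)

// loadStore is a sketch only: options, catalog and executer options are
// assumed to be prepared by the caller.
func loadStore(options *types.Options, cat *catalog.Catalog, executerOpts protocols.ExecuterOptions) (*loader.Store, error) {
	cfg := loader.NewConfig(options, cat, executerOpts)
	// Remote lists are plain text, one template/workflow path per line.
	cfg.TemplateURLs = append(cfg.TemplateURLs, "https://example.com/template-list.txt") // placeholder URL
	store, err := loader.New(cfg)
	if err != nil {
		return nil, errors.Wrap(err, "could not load templates from config")
	}
	store.Load()
	gologger.Info().Msgf("Loaded %d templates and %d workflows", len(store.Templates()), len(store.Workflows()))
	return store, nil
}
```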

59
v2/pkg/core/engine.go Normal file

@ -0,0 +1,59 @@
package core
import (
"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
"github.com/projectdiscovery/nuclei/v2/pkg/types"
)
// Engine is an executer for running Nuclei Templates/Workflows.
//
// The engine contains multiple thread pools which allow using different
// concurrency values per protocol executed.
//
// The engine does most of the heavy lifting of execution, from clustering
// templates to driving the final execution through the workpool.
type Engine struct {
workPool *WorkPool
options *types.Options
executerOpts protocols.ExecuterOptions
}
// InputProvider is an input providing interface for the nuclei execution
// engine.
//
// An example InputProvider implementation is provided in the form of the hybrid
// input provider in pkg/core/inputs/hybrid/hmap.go
type InputProvider interface {
// Count returns the number of items for input provider
Count() int64
// Scan iterates the input and each found item is passed to the
// callback consumer.
Scan(callback func(value string))
}
// New returns a new Engine instance
func New(options *types.Options) *Engine {
workPool := NewWorkPool(WorkPoolConfig{
InputConcurrency: options.BulkSize,
TypeConcurrency: options.TemplateThreads,
HeadlessInputConcurrency: options.HeadlessBulkSize,
HeadlessTypeConcurrency: options.HeadlessTemplateThreads,
})
engine := &Engine{
options: options,
workPool: workPool,
}
return engine
}
// SetExecuterOptions sets the executer options for the engine. This is required
// before using the engine to perform any execution.
func (e *Engine) SetExecuterOptions(options protocols.ExecuterOptions) {
e.executerOpts = options
}
// ExecuterOptions returns protocols.ExecuterOptions for nuclei engine.
func (e *Engine) ExecuterOptions() protocols.ExecuterOptions {
return e.executerOpts
}


@ -0,0 +1 @@
package core

93
v2/pkg/core/execute.go Normal file

@ -0,0 +1,93 @@
package core
import (
"github.com/projectdiscovery/gologger"
"github.com/projectdiscovery/nuclei/v2/pkg/templates"
"github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
"github.com/remeh/sizedwaitgroup"
"go.uber.org/atomic"
)
// Execute takes a list of templates/workflows that have been compiled
// and executes them based on provided concurrency options.
//
// All the execution logic for the templates/workflows happens in this part
// of the engine.
func (e *Engine) Execute(templates []*templates.Template, target InputProvider) *atomic.Bool {
return e.ExecuteWithOpts(templates, target, false)
}
// ExecuteWithOpts is Execute with the full set of options, including the ability to skip clustering
func (e *Engine) ExecuteWithOpts(templatesList []*templates.Template, target InputProvider, noCluster bool) *atomic.Bool {
var finalTemplates []*templates.Template
if !noCluster {
finalTemplates, _ = templates.ClusterTemplates(templatesList, e.executerOpts)
} else {
finalTemplates = templatesList
}
results := &atomic.Bool{}
for _, template := range finalTemplates {
templateType := template.Type()
var wg *sizedwaitgroup.SizedWaitGroup
if templateType == types.HeadlessProtocol {
wg = e.workPool.Headless
} else {
wg = e.workPool.Default
}
wg.Add()
switch {
case template.SelfContained:
// Self Contained requests are executed here separately
e.executeSelfContainedTemplateWithInput(template, results)
default:
// All other request types are executed here
e.executeModelWithInput(templateType, template, target, results)
}
wg.Done()
}
e.workPool.Wait()
return results
}
// executeSelfContainedTemplateWithInput executes a self-contained template.
func (e *Engine) executeSelfContainedTemplateWithInput(template *templates.Template, results *atomic.Bool) {
match, err := template.Executer.Execute("")
if err != nil {
gologger.Warning().Msgf("[%s] Could not execute step: %s\n", e.executerOpts.Colorizer.BrightBlue(template.ID), err)
}
results.CAS(false, match)
}
// executeModelWithInput executes a type of template with input
func (e *Engine) executeModelWithInput(templateType types.ProtocolType, template *templates.Template, target InputProvider, results *atomic.Bool) {
wg := e.workPool.InputPool(templateType)
target.Scan(func(scannedValue string) {
// Skip if the host has had errors
if e.executerOpts.HostErrorsCache != nil && e.executerOpts.HostErrorsCache.Check(scannedValue) {
return
}
wg.Waitgroup.Add()
go func(value string) {
defer wg.Waitgroup.Done()
var match bool
var err error
switch templateType {
case types.WorkflowProtocol:
match = e.executeWorkflow(value, template.CompiledWorkflow)
default:
match, err = template.Executer.Execute(value)
}
if err != nil {
gologger.Warning().Msgf("[%s] Could not execute step: %s\n", e.executerOpts.Colorizer.BrightBlue(template.ID), err)
}
results.CAS(false, match)
}(scannedValue)
})
wg.Waitgroup.Wait()
}


@ -0,0 +1,134 @@
// Package hybrid implements a hybrid hmap/filekv backed input provider
// for nuclei that can either stream or store results using different kv stores.
package hybrid
import (
"bufio"
"io"
"os"
"strings"
"github.com/pkg/errors"
"github.com/projectdiscovery/filekv"
"github.com/projectdiscovery/fileutil"
"github.com/projectdiscovery/gologger"
"github.com/projectdiscovery/hmap/store/hybrid"
"github.com/projectdiscovery/nuclei/v2/pkg/types"
)
// Input is a hmap/filekv backed nuclei Input provider
type Input struct {
inputCount int64
dupeCount int64
hostMap *hybrid.HybridMap
hostMapStream *filekv.FileDB
}
// New creates a new hmap backed nuclei Input Provider
// and initializes it based on the passed options.
func New(options *types.Options) (*Input, error) {
hm, err := hybrid.New(hybrid.DefaultDiskOptions)
if err != nil {
return nil, errors.Wrap(err, "could not create temporary input file")
}
input := &Input{hostMap: hm}
if options.Stream {
fkvOptions := filekv.DefaultOptions
if tmpFileName, err := fileutil.GetTempFileName(); err != nil {
return nil, errors.Wrap(err, "could not create temporary input file")
} else {
fkvOptions.Path = tmpFileName
}
fkv, err := filekv.Open(fkvOptions)
if err != nil {
return nil, errors.Wrap(err, "could not create temporary unsorted input file")
}
input.hostMapStream = fkv
}
if initErr := input.initializeInputSources(options); initErr != nil {
return nil, initErr
}
if input.dupeCount > 0 {
gologger.Info().Msgf("Supplied input was automatically deduplicated (%d removed).", input.dupeCount)
}
return input, nil
}
// Close closes the input provider
func (i *Input) Close() {
i.hostMap.Close()
if i.hostMapStream != nil {
i.hostMapStream.Close()
}
}
// initializeInputSources initializes the input sources for hmap input
func (i *Input) initializeInputSources(options *types.Options) error {
// Handle targets flags
for _, target := range options.Targets {
i.normalizeStoreInputValue(target)
}
// Handle stdin
if options.Stdin {
i.scanInputFromReader(os.Stdin)
}
// Handle target file
if options.TargetsFilePath != "" {
input, inputErr := os.Open(options.TargetsFilePath)
if inputErr != nil {
return errors.Wrap(inputErr, "could not open targets file")
}
i.scanInputFromReader(input)
input.Close()
}
return nil
}
// scanInputFromReader scans each line of input from the reader and passes it for storage
func (i *Input) scanInputFromReader(reader io.Reader) {
scanner := bufio.NewScanner(reader)
for scanner.Scan() {
i.normalizeStoreInputValue(scanner.Text())
}
}
// normalizeStoreInputValue normalizes and stores passed input values
func (i *Input) normalizeStoreInputValue(value string) {
url := strings.TrimSpace(value)
if url == "" {
return
}
if _, ok := i.hostMap.Get(url); ok {
i.dupeCount++
return
}
i.inputCount++
_ = i.hostMap.Set(url, nil)
if i.hostMapStream != nil {
_ = i.hostMapStream.Set([]byte(url), nil)
}
}
// Count returns the input count
func (i *Input) Count() int64 {
return i.inputCount
}
// Scan iterates the input and each found item is passed to the
// callback consumer.
func (i *Input) Scan(callback func(value string)) {
callbackFunc := func(k, _ []byte) error {
callback(string(k))
return nil
}
if i.hostMapStream != nil {
_ = i.hostMapStream.Scan(callbackFunc)
} else {
i.hostMap.Scan(callbackFunc)
}
}
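A short usage sketch for the new provider (target values are placeholders): it deduplicates on insert, then exposes the set through Count and Scan, which is exactly how the engine consumes it:

```go
package main

import (
	"fmt"
	"log"

	"github.com/projectdiscovery/nuclei/v2/pkg/core/inputs/hybrid"
	"github.com/projectdiscovery/nuclei/v2/pkg/types"
)

func main() {
	// Placeholder targets; the duplicate is removed by the provider itself.
	options := &types.Options{Targets: []string{"https://example.com", "https://example.com", "https://test.example"}}
	input, err := hybrid.New(options)
	if err != nil {
		log.Fatal(err)
	}
	defer input.Close()

	fmt.Printf("unique targets: %d\n", input.Count()) // 2
	input.Scan(func(value string) {
		fmt.Println(value)
	})
}
```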


@ -0,0 +1,17 @@
package inputs
type SimpleInputProvider struct {
Inputs []string
}
// Count returns the number of items for input provider
func (s *SimpleInputProvider) Count() int64 {
return int64(len(s.Inputs))
}
// Scan calls a callback function till the input provider is exhausted
func (s *SimpleInputProvider) Scan(callback func(value string)) {
for _, v := range s.Inputs {
callback(v)
}
}
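The simple provider above pairs with the new core engine. A minimal sketch of driving an execution; compiled templates and executer options are assumed to come from the loader and runner, and the pkg/core/inputs path for SimpleInputProvider is assumed from the surrounding files:

```go
package main

import (
	"github.com/projectdiscovery/nuclei/v2/pkg/core"
	"github.com/projectdiscovery/nuclei/v2/pkg/core/inputs" // assumed package path for SimpleInputProvider
	"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
	"github.com/projectdiscovery/nuclei/v2/pkg/templates"
	"github.com/projectdiscovery/nuclei/v2/pkg/types"
)

// runCompiledTemplates is a sketch: compiled templates and executer options
// are assumed to be produced by the loader, as in the runner.
func runCompiledTemplates(options *types.Options, executerOpts protocols.ExecuterOptions, compiled []*templates.Template) bool {
	engine := core.New(options)
	engine.SetExecuterOptions(executerOpts)

	// Any InputProvider works here; SimpleInputProvider wraps a static slice.
	input := &inputs.SimpleInputProvider{Inputs: []string{"https://example.com"}}

	// Execute clusters templates internally and returns an atomic bool
	// recording whether anything matched.
	results := engine.Execute(compiled, input)
	return results.Load()
}
```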


@ -1,21 +1,22 @@
package workflows
package core
import (
"github.com/projectdiscovery/gologger"
"github.com/projectdiscovery/nuclei/v2/pkg/output"
"github.com/projectdiscovery/nuclei/v2/pkg/workflows"
"github.com/remeh/sizedwaitgroup"
"go.uber.org/atomic"
)
// RunWorkflow runs a workflow on an input and returns true or false
func (w *Workflow) RunWorkflow(input string) bool {
// executeWorkflow runs a workflow on an input and returns true or false
func (e *Engine) executeWorkflow(input string, w *workflows.Workflow) bool {
results := &atomic.Bool{}
swg := sizedwaitgroup.New(w.Options.Options.TemplateThreads)
for _, template := range w.Workflows {
swg.Add()
func(template *WorkflowTemplate) {
if err := w.runWorkflowStep(template, input, results, &swg); err != nil {
func(template *workflows.WorkflowTemplate) {
if err := e.runWorkflowStep(template, input, results, &swg, w); err != nil {
gologger.Warning().Msgf("[%s] Could not execute workflow step: %s\n", template.Template, err)
}
swg.Done()
@ -27,7 +28,7 @@ func (w *Workflow) RunWorkflow(input string) bool {
// runWorkflowStep runs a workflow step for the workflow. It executes the workflow
// in a recursive manner running all subtemplates and matchers.
func (w *Workflow) runWorkflowStep(template *WorkflowTemplate, input string, results *atomic.Bool, swg *sizedwaitgroup.SizedWaitGroup) error {
func (e *Engine) runWorkflowStep(template *workflows.WorkflowTemplate, input string, results *atomic.Bool, swg *sizedwaitgroup.SizedWaitGroup, w *workflows.Workflow) error {
var firstMatched bool
var err error
var mainErr error
@ -90,8 +91,8 @@ func (w *Workflow) runWorkflowStep(template *WorkflowTemplate, input string, res
for _, subtemplate := range matcher.Subtemplates {
swg.Add()
go func(subtemplate *WorkflowTemplate) {
if err := w.runWorkflowStep(subtemplate, input, results, swg); err != nil {
go func(subtemplate *workflows.WorkflowTemplate) {
if err := e.runWorkflowStep(subtemplate, input, results, swg, w); err != nil {
gologger.Warning().Msgf("[%s] Could not execute workflow step: %s\n", subtemplate.Template, err)
}
swg.Done()
@ -114,8 +115,8 @@ func (w *Workflow) runWorkflowStep(template *WorkflowTemplate, input string, res
for _, subtemplate := range template.Subtemplates {
swg.Add()
go func(template *WorkflowTemplate) {
if err := w.runWorkflowStep(template, input, results, swg); err != nil {
go func(template *workflows.WorkflowTemplate) {
if err := e.runWorkflowStep(template, input, results, swg, w); err != nil {
gologger.Warning().Msgf("[%s] Could not execute workflow step: %s\n", template.Template, err)
}
swg.Done()


@ -1,4 +1,4 @@
package workflows
package core
import (
"testing"
@ -10,18 +10,20 @@ import (
"github.com/projectdiscovery/nuclei/v2/pkg/progress"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
"github.com/projectdiscovery/nuclei/v2/pkg/types"
"github.com/projectdiscovery/nuclei/v2/pkg/workflows"
)
func TestWorkflowsSimple(t *testing.T) {
progressBar, _ := progress.NewStatsTicker(0, false, false, false, 0)
workflow := &Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*WorkflowTemplate{
{Executers: []*ProtocolExecuterPair{{
workflow := &workflows.Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*workflows.WorkflowTemplate{
{Executers: []*workflows.ProtocolExecuterPair{{
Executer: &mockExecuter{result: true}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
}},
}}
matched := workflow.RunWorkflow("https://test.com")
engine := &Engine{}
matched := engine.executeWorkflow("https://test.com", workflow)
require.True(t, matched, "could not get correct match value")
}
@ -29,20 +31,21 @@ func TestWorkflowsSimpleMultiple(t *testing.T) {
progressBar, _ := progress.NewStatsTicker(0, false, false, false, 0)
var firstInput, secondInput string
workflow := &Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*WorkflowTemplate{
{Executers: []*ProtocolExecuterPair{{
workflow := &workflows.Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*workflows.WorkflowTemplate{
{Executers: []*workflows.ProtocolExecuterPair{{
Executer: &mockExecuter{result: true, executeHook: func(input string) {
firstInput = input
}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
}},
{Executers: []*ProtocolExecuterPair{{
{Executers: []*workflows.ProtocolExecuterPair{{
Executer: &mockExecuter{result: true, executeHook: func(input string) {
secondInput = input
}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
}},
}}
matched := workflow.RunWorkflow("https://test.com")
engine := &Engine{}
matched := engine.executeWorkflow("https://test.com", workflow)
require.True(t, matched, "could not get correct match value")
require.Equal(t, "https://test.com", firstInput, "could not get correct first input")
@ -53,21 +56,22 @@ func TestWorkflowsSubtemplates(t *testing.T) {
progressBar, _ := progress.NewStatsTicker(0, false, false, false, 0)
var firstInput, secondInput string
workflow := &Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*WorkflowTemplate{
{Executers: []*ProtocolExecuterPair{{
workflow := &workflows.Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*workflows.WorkflowTemplate{
{Executers: []*workflows.ProtocolExecuterPair{{
Executer: &mockExecuter{result: true, executeHook: func(input string) {
firstInput = input
}, outputs: []*output.InternalWrappedEvent{
{OperatorsResult: &operators.Result{}, Results: []*output.ResultEvent{{}}},
}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
}, Subtemplates: []*WorkflowTemplate{{Executers: []*ProtocolExecuterPair{{
}, Subtemplates: []*workflows.WorkflowTemplate{{Executers: []*workflows.ProtocolExecuterPair{{
Executer: &mockExecuter{result: true, executeHook: func(input string) {
secondInput = input
}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
}}}},
}}
matched := workflow.RunWorkflow("https://test.com")
engine := &Engine{}
matched := engine.executeWorkflow("https://test.com", workflow)
require.True(t, matched, "could not get correct match value")
require.Equal(t, "https://test.com", firstInput, "could not get correct first input")
@ -78,19 +82,20 @@ func TestWorkflowsSubtemplatesNoMatch(t *testing.T) {
progressBar, _ := progress.NewStatsTicker(0, false, false, false, 0)
var firstInput, secondInput string
workflow := &Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*WorkflowTemplate{
{Executers: []*ProtocolExecuterPair{{
workflow := &workflows.Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*workflows.WorkflowTemplate{
{Executers: []*workflows.ProtocolExecuterPair{{
Executer: &mockExecuter{result: false, executeHook: func(input string) {
firstInput = input
}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
}, Subtemplates: []*WorkflowTemplate{{Executers: []*ProtocolExecuterPair{{
}, Subtemplates: []*workflows.WorkflowTemplate{{Executers: []*workflows.ProtocolExecuterPair{{
Executer: &mockExecuter{result: true, executeHook: func(input string) {
secondInput = input
}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
}}}},
}}
matched := workflow.RunWorkflow("https://test.com")
engine := &Engine{}
matched := engine.executeWorkflow("https://test.com", workflow)
require.False(t, matched, "could not get correct match value")
require.Equal(t, "https://test.com", firstInput, "could not get correct first input")
@ -101,8 +106,8 @@ func TestWorkflowsSubtemplatesWithMatcher(t *testing.T) {
progressBar, _ := progress.NewStatsTicker(0, false, false, false, 0)
var firstInput, secondInput string
workflow := &Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*WorkflowTemplate{
{Executers: []*ProtocolExecuterPair{{
workflow := &workflows.Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*workflows.WorkflowTemplate{
{Executers: []*workflows.ProtocolExecuterPair{{
Executer: &mockExecuter{result: true, executeHook: func(input string) {
firstInput = input
}, outputs: []*output.InternalWrappedEvent{
@ -111,14 +116,15 @@ func TestWorkflowsSubtemplatesWithMatcher(t *testing.T) {
Extracts: map[string][]string{},
}},
}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
}, Matchers: []*Matcher{{Name: "tomcat", Subtemplates: []*WorkflowTemplate{{Executers: []*ProtocolExecuterPair{{
}, Matchers: []*workflows.Matcher{{Name: "tomcat", Subtemplates: []*workflows.WorkflowTemplate{{Executers: []*workflows.ProtocolExecuterPair{{
Executer: &mockExecuter{result: true, executeHook: func(input string) {
secondInput = input
}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
}}}}}},
}}
matched := workflow.RunWorkflow("https://test.com")
engine := &Engine{}
matched := engine.executeWorkflow("https://test.com", workflow)
require.True(t, matched, "could not get correct match value")
require.Equal(t, "https://test.com", firstInput, "could not get correct first input")
@ -129,8 +135,8 @@ func TestWorkflowsSubtemplatesWithMatcherNoMatch(t *testing.T) {
progressBar, _ := progress.NewStatsTicker(0, false, false, false, 0)
var firstInput, secondInput string
workflow := &Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*WorkflowTemplate{
{Executers: []*ProtocolExecuterPair{{
workflow := &workflows.Workflow{Options: &protocols.ExecuterOptions{Options: &types.Options{TemplateThreads: 10}}, Workflows: []*workflows.WorkflowTemplate{
{Executers: []*workflows.ProtocolExecuterPair{{
Executer: &mockExecuter{result: true, executeHook: func(input string) {
firstInput = input
}, outputs: []*output.InternalWrappedEvent{
@ -139,14 +145,15 @@ func TestWorkflowsSubtemplatesWithMatcherNoMatch(t *testing.T) {
Extracts: map[string][]string{},
}},
}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
}, Matchers: []*Matcher{{Name: "apache", Subtemplates: []*WorkflowTemplate{{Executers: []*ProtocolExecuterPair{{
}, Matchers: []*workflows.Matcher{{Name: "apache", Subtemplates: []*workflows.WorkflowTemplate{{Executers: []*workflows.ProtocolExecuterPair{{
Executer: &mockExecuter{result: true, executeHook: func(input string) {
secondInput = input
}}, Options: &protocols.ExecuterOptions{Progress: progressBar}},
}}}}}},
}}
matched := workflow.RunWorkflow("https://test.com")
engine := &Engine{}
matched := engine.executeWorkflow("https://test.com", workflow)
require.False(t, matched, "could not get correct match value")
require.Equal(t, "https://test.com", firstInput, "could not get correct first input")

64
v2/pkg/core/workpool.go Normal file

@ -0,0 +1,64 @@
package core
import (
"github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
"github.com/remeh/sizedwaitgroup"
)
// WorkPool implements an execution pool for executing different
// types of tasks with different concurrency requirements.
//
// It also allows configuration of such requirements. This is used
// for per-module needs like separate headless concurrency.
type WorkPool struct {
Headless *sizedwaitgroup.SizedWaitGroup
Default *sizedwaitgroup.SizedWaitGroup
config WorkPoolConfig
}
// WorkPoolConfig is the configuration for workpool
type WorkPoolConfig struct {
// InputConcurrency is the concurrency for input values.
InputConcurrency int
// TypeConcurrency is the concurrency for the request type templates.
TypeConcurrency int
// HeadlessInputConcurrency is the concurrency for headless input values.
HeadlessInputConcurrency int
// HeadlessTypeConcurrency is the concurrency for the headless request type templates.
HeadlessTypeConcurrency int
}
// NewWorkPool returns a new WorkPool instance
func NewWorkPool(config WorkPoolConfig) *WorkPool {
headlessWg := sizedwaitgroup.New(config.HeadlessTypeConcurrency)
defaultWg := sizedwaitgroup.New(config.TypeConcurrency)
return &WorkPool{
config: config,
Headless: &headlessWg,
Default: &defaultWg,
}
}
// Wait waits for all the workpool waitgroups to finish
func (w *WorkPool) Wait() {
w.Default.Wait()
w.Headless.Wait()
}
// InputWorkPool is a workpool per-input
type InputWorkPool struct {
Waitgroup *sizedwaitgroup.SizedWaitGroup
}
// InputPool returns a workpool for an input type
func (w *WorkPool) InputPool(templateType types.ProtocolType) *InputWorkPool {
var count int
if templateType == types.HeadlessProtocol {
count = w.config.HeadlessInputConcurrency
} else {
count = w.config.InputConcurrency
}
swg := sizedwaitgroup.New(count)
return &InputWorkPool{Waitgroup: &swg}
}

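The pool above is self-contained, so a brief usage sketch may help; it is illustrative only, `types.HTTPProtocol` is assumed to exist alongside the `types.HeadlessProtocol` constant referenced in `InputPool`, and the concurrency numbers are arbitrary.

```go
package main

import (
	"github.com/projectdiscovery/nuclei/v2/pkg/core"
	"github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
)

func main() {
	// Separate limits for headless and non-headless work.
	pool := core.NewWorkPool(core.WorkPoolConfig{
		InputConcurrency:         25,
		TypeConcurrency:          10,
		HeadlessInputConcurrency: 5,
		HeadlessTypeConcurrency:  2,
	})

	inputs := []string{"https://a.example", "https://b.example"}

	pool.Default.Add() // blocks until a type-level slot is free
	go func() {
		defer pool.Default.Done()

		// Per-input waitgroup; types.HTTPProtocol is an assumed constant here.
		inputPool := pool.InputPool(types.HTTPProtocol)
		for _, input := range inputs {
			inputPool.Waitgroup.Add()
			go func(in string) {
				defer inputPool.Waitgroup.Done()
				_ = in // run the template against `in` here
			}(input)
		}
		inputPool.Waitgroup.Wait()
	}()

	pool.Wait()
}
```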
View File

@ -1,6 +1,8 @@
package dsl
import (
"bytes"
"compress/gzip"
"crypto/md5"
"crypto/sha1"
"crypto/sha256"
@ -31,21 +33,38 @@ const (
withMaxRandArgsSize = withCutSetArgsSize
)
var ErrDSLArguments = errors.New("invalid arguments provided to dsl")
var functions = map[string]govaluate.ExpressionFunction{
"len": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
length := len(types.ToString(args[0]))
return float64(length), nil
},
"toupper": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
return strings.ToUpper(types.ToString(args[0])), nil
},
"tolower": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
return strings.ToLower(types.ToString(args[0])), nil
},
"replace": func(args ...interface{}) (interface{}, error) {
if len(args) != 3 {
return nil, ErrDSLArguments
}
return strings.ReplaceAll(types.ToString(args[0]), types.ToString(args[1]), types.ToString(args[2])), nil
},
"replace_regex": func(args ...interface{}) (interface{}, error) {
if len(args) != 3 {
return nil, ErrDSLArguments
}
compiled, err := regexp.Compile(types.ToString(args[1]))
if err != nil {
return nil, err
@ -53,66 +72,133 @@ var functions = map[string]govaluate.ExpressionFunction{
return compiled.ReplaceAllString(types.ToString(args[0]), types.ToString(args[2])), nil
},
"trim": func(args ...interface{}) (interface{}, error) {
if len(args) != 2 {
return nil, ErrDSLArguments
}
return strings.Trim(types.ToString(args[0]), types.ToString(args[1])), nil
},
"trimleft": func(args ...interface{}) (interface{}, error) {
if len(args) != 2 {
return nil, ErrDSLArguments
}
return strings.TrimLeft(types.ToString(args[0]), types.ToString(args[1])), nil
},
"trimright": func(args ...interface{}) (interface{}, error) {
if len(args) != 2 {
return nil, ErrDSLArguments
}
return strings.TrimRight(types.ToString(args[0]), types.ToString(args[1])), nil
},
"trimspace": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
return strings.TrimSpace(types.ToString(args[0])), nil
},
"trimprefix": func(args ...interface{}) (interface{}, error) {
if len(args) != 2 {
return nil, ErrDSLArguments
}
return strings.TrimPrefix(types.ToString(args[0]), types.ToString(args[1])), nil
},
"trimsuffix": func(args ...interface{}) (interface{}, error) {
if len(args) != 2 {
return nil, ErrDSLArguments
}
return strings.TrimSuffix(types.ToString(args[0]), types.ToString(args[1])), nil
},
"reverse": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
return reverseString(types.ToString(args[0])), nil
},
// encoding
"base64": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
sEnc := base64.StdEncoding.EncodeToString([]byte(types.ToString(args[0])))
return sEnc, nil
},
"gzip": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
buffer := &bytes.Buffer{}
writer := gzip.NewWriter(buffer)
if _, err := writer.Write([]byte(args[0].(string))); err != nil {
return "", err
}
_ = writer.Close()
return buffer.String(), nil
},
// Python encodes to base64 with lines of 76 bytes terminated by a newline "\n"
"base64_py": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
sEnc := base64.StdEncoding.EncodeToString([]byte(types.ToString(args[0])))
return deserialization.InsertInto(sEnc, 76, '\n'), nil
},
"base64_decode": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
return base64.StdEncoding.DecodeString(types.ToString(args[0]))
},
"url_encode": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
return url.QueryEscape(types.ToString(args[0])), nil
},
"url_decode": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
return url.QueryUnescape(types.ToString(args[0]))
},
"hex_encode": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
return hex.EncodeToString([]byte(types.ToString(args[0]))), nil
},
"hex_decode": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
hx, _ := hex.DecodeString(types.ToString(args[0]))
return string(hx), nil
},
"html_escape": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
return html.EscapeString(types.ToString(args[0])), nil
},
"html_unescape": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
return html.UnescapeString(types.ToString(args[0])), nil
},
// hashing
"md5": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
hash := md5.Sum([]byte(types.ToString(args[0])))
return hex.EncodeToString(hash[:]), nil
},
"sha256": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
h := sha256.New()
if _, err := h.Write([]byte(types.ToString(args[0]))); err != nil {
return nil, err
@ -120,6 +206,9 @@ var functions = map[string]govaluate.ExpressionFunction{
return hex.EncodeToString(h.Sum(nil)), nil
},
"sha1": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
h := sha1.New()
if _, err := h.Write([]byte(types.ToString(args[0]))); err != nil {
return nil, err
@ -127,13 +216,22 @@ var functions = map[string]govaluate.ExpressionFunction{
return hex.EncodeToString(h.Sum(nil)), nil
},
"mmh3": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
return fmt.Sprintf("%d", int32(murmur3.Sum32WithSeed([]byte(types.ToString(args[0])), 0))), nil
},
// search
"contains": func(args ...interface{}) (interface{}, error) {
if len(args) != 2 {
return nil, ErrDSLArguments
}
return strings.Contains(types.ToString(args[0]), types.ToString(args[1])), nil
},
"regex": func(args ...interface{}) (interface{}, error) {
if len(args) != 2 {
return nil, ErrDSLArguments
}
compiled, err := regexp.Compile(types.ToString(args[0]))
if err != nil {
return nil, err
@ -142,6 +240,9 @@ var functions = map[string]govaluate.ExpressionFunction{
},
// random generators
"rand_char": func(args ...interface{}) (interface{}, error) {
if len(args) != 2 {
return nil, ErrDSLArguments
}
chars := letters + numbers
bad := ""
if len(args) >= 1 {
@ -154,6 +255,9 @@ var functions = map[string]govaluate.ExpressionFunction{
return chars[rand.Intn(len(chars))], nil
},
"rand_base": func(args ...interface{}) (interface{}, error) {
if len(args) != 3 {
return nil, ErrDSLArguments
}
l := 0
bad := ""
base := letters + numbers
@ -171,6 +275,9 @@ var functions = map[string]govaluate.ExpressionFunction{
return randSeq(base, l), nil
},
"rand_text_alphanumeric": func(args ...interface{}) (interface{}, error) {
if len(args) != 2 {
return nil, ErrDSLArguments
}
l := 0
bad := ""
chars := letters + numbers
@ -185,6 +292,9 @@ var functions = map[string]govaluate.ExpressionFunction{
return randSeq(chars, l), nil
},
"rand_text_alpha": func(args ...interface{}) (interface{}, error) {
if len(args) != 2 {
return nil, ErrDSLArguments
}
l := 0
bad := ""
chars := letters
@ -199,6 +309,9 @@ var functions = map[string]govaluate.ExpressionFunction{
return randSeq(chars, l), nil
},
"rand_text_numeric": func(args ...interface{}) (interface{}, error) {
if len(args) != 2 {
return nil, ErrDSLArguments
}
l := 0
bad := ""
chars := numbers
@ -213,6 +326,9 @@ var functions = map[string]govaluate.ExpressionFunction{
return randSeq(chars, l), nil
},
"rand_int": func(args ...interface{}) (interface{}, error) {
if len(args) != 2 {
return nil, ErrDSLArguments
}
min := 0
max := math.MaxInt32
@ -231,16 +347,22 @@ var functions = map[string]govaluate.ExpressionFunction{
}
now := time.Now()
offset := now.Add(time.Duration(seconds) * time.Second)
return offset.Unix(), nil
return float64(offset.Unix()), nil
},
// Time Functions
"waitfor": func(args ...interface{}) (interface{}, error) {
if len(args) != 1 {
return nil, ErrDSLArguments
}
seconds := args[0].(float64)
time.Sleep(time.Duration(seconds) * time.Second)
return true, nil
},
// deserialization Functions
"generate_java_gadget": func(args ...interface{}) (interface{}, error) {
if len(args) != 3 {
return nil, ErrDSLArguments
}
gadget := args[0].(string)
cmd := args[1].(string)

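As a rough sketch of how these helpers are consumed (the import path `pkg/operators/common/dsl` is assumed from the package name; the expression itself is illustrative), every helper first validates its argument count and returns `ErrDSLArguments` on a mismatch before doing any work:

```go
package main

import (
	"fmt"

	"github.com/Knetic/govaluate"
	"github.com/projectdiscovery/nuclei/v2/pkg/operators/common/dsl" // assumed path
)

func main() {
	// Helpers are plugged into govaluate the same way the tests below do it.
	expr, err := govaluate.NewEvaluableExpressionWithFunctions(
		`contains(toupper(value), "NUC") && len(value) == 6`,
		dsl.HelperFunctions(),
	)
	if err != nil {
		panic(err)
	}
	result, err := expr.Evaluate(map[string]interface{}{"value": "nuclei"})
	if err != nil {
		panic(err)
	}
	fmt.Println(result) // true
}
```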
View File

@ -1,8 +1,14 @@
package dsl
import (
"compress/gzip"
"io/ioutil"
"strings"
"testing"
"time"
"github.com/Knetic/govaluate"
"github.com/projectdiscovery/nuclei/v2/pkg/types"
"github.com/stretchr/testify/require"
)
@ -17,3 +23,25 @@ func TestDSLURLEncodeDecode(t *testing.T) {
require.Nil(t, err, "could not url encode")
require.Equal(t, "&test\"", decoded, "could not get url decoded data")
}
func TestDSLTimeComparison(t *testing.T) {
compiled, err := govaluate.NewEvaluableExpressionWithFunctions("unixtime() > not_after", HelperFunctions())
require.Nil(t, err, "could not compile time comparison expression")
result, err := compiled.Evaluate(map[string]interface{}{"not_after": float64(time.Now().Unix() - 1000)})
require.Nil(t, err, "could not evaluate time comparison expression")
require.Equal(t, true, result, "could not get correct time comparison result")
}
func TestDSLGzipSerialize(t *testing.T) {
compiled, err := govaluate.NewEvaluableExpressionWithFunctions("gzip(\"hello world\")", HelperFunctions())
require.Nil(t, err, "could not compile gzip expression")
result, err := compiled.Evaluate(make(map[string]interface{}))
require.Nil(t, err, "could not evaluate gzip expression")
reader, _ := gzip.NewReader(strings.NewReader(types.ToString(result)))
data, _ := ioutil.ReadAll(reader)
require.Equal(t, "hello world", string(data), "could not get gzip encoded data")
}

View File

@ -10,13 +10,12 @@ import (
// CompileExtractors performs the initial setup operation on an extractor
func (e *Extractor) CompileExtractors() error {
var ok bool
// Set up the extractor type
e.extractorType, ok = ExtractorTypes[e.Type]
if !ok {
computedType, err := toExtractorTypes(e.GetType().String())
if err != nil {
return fmt.Errorf("unknown extractor type specified: %s", e.Type)
}
e.extractorType = computedType
// Compile the regexes
for _, regex := range e.Regex {
compiled, err := regexp.Compile(regex)
@ -25,7 +24,6 @@ func (e *Extractor) CompileExtractors() error {
}
e.regexCompiled = append(e.regexCompiled, compiled)
}
for i, kval := range e.KVal {
e.KVal[i] = strings.ToLower(kval)
}
@ -42,9 +40,14 @@ func (e *Extractor) CompileExtractors() error {
e.jsonCompiled = append(e.jsonCompiled, compiled)
}
// Set up the part of the request to match, if any.
if e.Part == "" {
e.Part = "body"
if e.CaseInsensitive {
if e.GetType() != KValExtractor {
return fmt.Errorf("case-insensitive flag is supported only for 'kval' extractors (not '%s')", e.Type)
}
for i := range e.KVal {
e.KVal[i] = strings.ToLower(e.KVal[i])
}
}
return nil
}

View File

@ -34,8 +34,18 @@ func (e *Extractor) ExtractRegex(corpus string) map[string]struct{} {
// ExtractKval extracts key value pairs from a data map
func (e *Extractor) ExtractKval(data map[string]interface{}) map[string]struct{} {
results := make(map[string]struct{})
if e.CaseInsensitive {
inputData := data
data = make(map[string]interface{}, len(inputData))
for k, v := range inputData {
if s, ok := v.(string); ok {
v = strings.ToLower(s)
}
data[strings.ToLower(k)] = v
}
}
results := make(map[string]struct{})
for _, k := range e.KVal {
item, ok := data[k]
if !ok {

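Taken together with the compile-time lowercasing above, a case-insensitive kval extraction could look roughly like this; the `pkg/operators/extractors` import path is assumed, and the remainder of `ExtractKval` (not shown in this hunk) is assumed to collect the matched values.

```go
package main

import (
	"fmt"

	"github.com/projectdiscovery/nuclei/v2/pkg/operators/extractors" // assumed path
)

func main() {
	// KVal keys are lowercased at compile time when case-insensitive is set,
	// and ExtractKval lowercases the incoming keys and string values to match.
	extractor := &extractors.Extractor{
		Type:            extractors.TypeHolder{ExtractorType: extractors.KValExtractor},
		KVal:            []string{"Server"},
		CaseInsensitive: true,
	}
	if err := extractor.CompileExtractors(); err != nil {
		panic(err)
	}
	results := extractor.ExtractKval(map[string]interface{}{"SERVER": "Apache/2.4.50"})
	fmt.Println(results) // e.g. map[apache/2.4.50:{}]
}
```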
View File

@ -0,0 +1,105 @@
package extractors
import (
"encoding/json"
"errors"
"strings"
"github.com/alecthomas/jsonschema"
)
// ExtractorType is the type of the extractor specified
type ExtractorType int
const (
// RegexExtractor extracts responses with regexes
RegexExtractor ExtractorType = iota + 1
// KValExtractor extracts responses with key:value
KValExtractor
// XPathExtractor extracts responses with Xpath selectors
XPathExtractor
// JSONExtractor extracts responses with json
JSONExtractor
//limit
limit
)
// extractorMappings is a table for conversion of extractor type from string.
var extractorMappings = map[ExtractorType]string{
RegexExtractor: "regex",
KValExtractor: "kval",
XPathExtractor: "xpath",
JSONExtractor: "json",
}
// GetType returns the type of the extractor
func (e *Extractor) GetType() ExtractorType {
return e.Type.ExtractorType
}
// GetSupportedExtractorTypes returns list of supported types
func GetSupportedExtractorTypes() []ExtractorType {
var result []ExtractorType
for index := ExtractorType(1); index < limit; index++ {
result = append(result, index)
}
return result
}
func toExtractorTypes(valueToMap string) (ExtractorType, error) {
normalizedValue := normalizeValue(valueToMap)
for key, currentValue := range extractorMappings {
if normalizedValue == currentValue {
return key, nil
}
}
return -1, errors.New("Invalid extractor type: " + valueToMap)
}
func normalizeValue(value string) string {
return strings.TrimSpace(strings.ToLower(value))
}
func (t ExtractorType) String() string {
return extractorMappings[t]
}
// TypeHolder is used to hold internal type of the extractor
type TypeHolder struct {
ExtractorType ExtractorType
}
func (holder TypeHolder) JSONSchemaType() *jsonschema.Type {
gotType := &jsonschema.Type{
Type: "string",
Title: "type of the extractor",
Description: "Type of the extractor",
}
for _, types := range GetSupportedExtractorTypes() {
gotType.Enum = append(gotType.Enum, types.String())
}
return gotType
}
func (holder *TypeHolder) UnmarshalYAML(unmarshal func(interface{}) error) error {
var marshalledTypes string
if err := unmarshal(&marshalledTypes); err != nil {
return err
}
computedType, err := toExtractorTypes(marshalledTypes)
if err != nil {
return err
}
holder.ExtractorType = computedType
return nil
}
func (holder *TypeHolder) MarshalJSON() ([]byte, error) {
return json.Marshal(holder.ExtractorType.String())
}
func (holder TypeHolder) MarshalYAML() (interface{}, error) {
return holder.ExtractorType.String(), nil
}

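A small sketch of the round-trip this holder enables (import path assumed as above): YAML input is normalized through `toExtractorTypes` and stored as the enum value, so templates can keep writing plain strings.

```go
package main

import (
	"fmt"

	"gopkg.in/yaml.v2"

	"github.com/projectdiscovery/nuclei/v2/pkg/operators/extractors" // assumed path
)

func main() {
	// "KVAL" is trimmed and lowercased before the lookup in extractorMappings.
	var holder extractors.TypeHolder
	if err := yaml.Unmarshal([]byte("KVAL"), &holder); err != nil {
		panic(err)
	}
	fmt.Println(holder.ExtractorType == extractors.KValExtractor) // true
	fmt.Println(holder.ExtractorType.String())                    // kval
}
```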
View File

@ -21,7 +21,7 @@ type Extractor struct {
// - "kval"
// - "json"
// - "xpath"
Type string `yaml:"type" jsonschema:"title=type of the extractor,description=Type of the extractor,enum=regex,enum=kval,enum=json,enum=xpath"`
Type TypeHolder `json:"name,omitempty" yaml:"type"`
// extractorType is the internal type of the extractor
extractorType ExtractorType
@ -105,31 +105,11 @@ type Extractor struct {
// Internal, when set to true will allow using the value extracted
// in the next request for some protocols (like HTTP).
Internal bool `yaml:"internal,omitempty" jsonschema:"title=mark extracted value for internal variable use,description=Internal when set to true will allow using the value extracted in the next request for some protocols"`
}
// ExtractorType is the type of the extractor specified
type ExtractorType = int
const (
// RegexExtractor extracts responses with regexes
RegexExtractor ExtractorType = iota + 1
// KValExtractor extracts responses with key:value
KValExtractor
// XPathExtractor extracts responses with Xpath selectors
XPathExtractor
// JSONExtractor extracts responses with json
JSONExtractor
)
// ExtractorTypes is a table for conversion of extractor type from string.
var ExtractorTypes = map[string]ExtractorType{
"regex": RegexExtractor,
"kval": KValExtractor,
"xpath": XPathExtractor,
"json": JSONExtractor,
}
// GetType returns the type of the matcher
func (e *Extractor) GetType() ExtractorType {
return e.extractorType
// description: |
// CaseInsensitive enables case-insensitive extractions. Default is false.
// values:
// - false
// - true
CaseInsensitive bool `yaml:"case-insensitive,omitempty" jsonschema:"title=use case insensitive extract,description=use case insensitive extract"`
}

View File

@ -4,6 +4,7 @@ import (
"encoding/hex"
"fmt"
"regexp"
"strings"
"github.com/Knetic/govaluate"
@ -24,15 +25,18 @@ func (m *Matcher) CompileMatchers() error {
}
// Set up the matcher type
m.matcherType, ok = MatcherTypes[m.Type]
if !ok {
computedType, err := toMatcherTypes(m.GetType().String())
if err != nil {
return fmt.Errorf("unknown matcher type specified: %s", m.Type)
}
m.matcherType = computedType
// By default, match on body if user hasn't provided any specific items
if m.Part == "" {
m.Part = "body"
}
// Compile the regexes
for _, regex := range m.Regex {
compiled, err := regexp.Compile(regex)
@ -42,6 +46,15 @@ func (m *Matcher) CompileMatchers() error {
m.regexCompiled = append(m.regexCompiled, compiled)
}
// Compile and validate binary values in the matcher
for _, value := range m.Binary {
if decoded, err := hex.DecodeString(value); err != nil {
return fmt.Errorf("could not hex decode binary: %s", value)
} else {
m.binaryDecoded = append(m.binaryDecoded, string(decoded))
}
}
// Compile the dsl expressions
for _, expr := range m.DSL {
compiled, err := govaluate.NewEvaluableExpressionWithFunctions(expr, dsl.HelperFunctions())
@ -60,5 +73,14 @@ func (m *Matcher) CompileMatchers() error {
} else {
m.condition = ORCondition
}
if m.CaseInsensitive {
if m.GetType() != WordsMatcher {
return fmt.Errorf("case-insensitive flag is supported only for 'word' matchers (not '%s')", m.Type)
}
for i := range m.Words {
m.Words[i] = strings.ToLower(m.Words[i])
}
}
return nil
}

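A minimal sketch of the new case-insensitive path (import path `pkg/operators/matchers` assumed): the flag is rejected for anything but word matchers, the words are lowercased once at compile time, and `MatchWords` lowercases the corpus at match time.

```go
package main

import (
	"fmt"

	"github.com/projectdiscovery/nuclei/v2/pkg/operators/matchers" // assumed path
)

func main() {
	m := &matchers.Matcher{
		Type:            matchers.MatcherTypeHolder{MatcherType: matchers.WordsMatcher},
		Words:           []string{"Apache"},
		CaseInsensitive: true,
	}
	if err := m.CompileMatchers(); err != nil {
		panic(err)
	}
	// The corpus is lowercased internally, so the match is case-insensitive.
	matched, _ := m.MatchWords("SERVER: APACHE/2.4.50", nil)
	fmt.Println(matched) // true
}
```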
View File

@ -1,10 +1,8 @@
package matchers
import (
"encoding/hex"
"strings"
"github.com/projectdiscovery/gologger"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/expressions"
)
@ -42,6 +40,10 @@ func (m *Matcher) MatchSize(length int) bool {
// MatchWords matches a word check against a corpus.
func (m *Matcher) MatchWords(corpus string, dynamicValues map[string]interface{}) (bool, []string) {
if m.CaseInsensitive {
corpus = strings.ToLower(corpus)
}
var matchedWords []string
// Iterate over all the words accepted as valid
for i, word := range m.Words {
@ -116,17 +118,8 @@ func (m *Matcher) MatchRegex(corpus string) (bool, []string) {
func (m *Matcher) MatchBinary(corpus string) (bool, []string) {
var matchedBinary []string
// Iterate over all the words accepted as valid
for i, binary := range m.Binary {
// Continue if the word doesn't match
hexa, err := hex.DecodeString(binary)
if err != nil {
gologger.Warning().Msgf("Could not hex encode the given binary matcher value: '%s'", binary)
if m.condition == ANDCondition {
return false, []string{}
}
continue
}
if !strings.Contains(corpus, string(hexa)) {
for i, binary := range m.binaryDecoded {
if !strings.Contains(corpus, binary) {
// If we are in an AND request and a match failed,
// return false as the AND condition fails on any single mismatch.
if m.condition == ANDCondition {
@ -138,10 +131,10 @@ func (m *Matcher) MatchBinary(corpus string) (bool, []string) {
// If the condition was an OR, return on the first match.
if m.condition == ORCondition {
return true, []string{string(hexa)}
return true, []string{binary}
}
matchedBinary = append(matchedBinary, string(hexa))
matchedBinary = append(matchedBinary, binary)
// If we are at the end of the words, return with true
if len(m.Binary)-1 == i {

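Since the hex values are now decoded once in `CompileMatchers` (and invalid hex is rejected there instead of being warned about on every match), a binary match reduces to substring checks; a hedged sketch, with the same assumed import path:

```go
package main

import (
	"fmt"

	"github.com/projectdiscovery/nuclei/v2/pkg/operators/matchers" // assumed path
)

func main() {
	// "50494e47" is hex for "PING"; it is decoded into binaryDecoded at compile time.
	m := &matchers.Matcher{
		Type:   matchers.MatcherTypeHolder{MatcherType: matchers.BinaryMatcher},
		Binary: []string{"50494e47"},
	}
	if err := m.CompileMatchers(); err != nil {
		panic(err)
	}
	matched, _ := m.MatchBinary("some PING response")
	fmt.Println(matched) // true
}
```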
View File

@ -19,7 +19,7 @@ func TestWordANDCondition(t *testing.T) {
}
func TestRegexANDCondition(t *testing.T) {
m := &Matcher{Type: "regex", Condition: "and", Regex: []string{"[a-z]{3}", "\\d{2}"}}
m := &Matcher{Type: MatcherTypeHolder{MatcherType: RegexMatcher}, Condition: "and", Regex: []string{"[a-z]{3}", "\\d{2}"}}
err := m.CompileMatchers()
require.Nil(t, err)
@ -49,7 +49,7 @@ func TestORCondition(t *testing.T) {
}
func TestRegexOrCondition(t *testing.T) {
m := &Matcher{Type: "regex", Condition: "or", Regex: []string{"[a-z]{3}", "\\d{2}"}}
m := &Matcher{Type: MatcherTypeHolder{MatcherType: RegexMatcher}, Condition: "or", Regex: []string{"[a-z]{3}", "\\d{2}"}}
err := m.CompileMatchers()
require.Nil(t, err)
@ -63,7 +63,7 @@ func TestRegexOrCondition(t *testing.T) {
}
func TestHexEncoding(t *testing.T) {
m := &Matcher{Encoding: "hex", Type: "word", Part: "body", Words: []string{"50494e47"}}
m := &Matcher{Encoding: "hex", Type: MatcherTypeHolder{MatcherType: WordsMatcher}, Part: "body", Words: []string{"50494e47"}}
err := m.CompileMatchers()
require.Nil(t, err, "could not compile matcher")

View File

@ -17,7 +17,7 @@ type Matcher struct {
// - "regex"
// - "binary"
// - "dsl"
Type string `yaml:"type" jsonschema:"title=type of matcher,description=Type of the matcher,enum=status,enum=size,enum=word,enum=regex,enum=binary,enum=dsl"`
Type MatcherTypeHolder `yaml:"type" jsonschema:"title=type of matcher,description=Type of the matcher,enum=status,enum=size,enum=word,enum=regex,enum=binary,enum=dsl"`
// description: |
// Condition is the optional condition between two matcher variables. By default,
// the condition is assumed to be OR.
@ -105,41 +105,21 @@ type Matcher struct {
// values:
// - "hex"
Encoding string `yaml:"encoding,omitempty" jsonschema:"title=encoding for word field,description=Optional encoding for the word fields,enum=hex"`
// description: |
// CaseInsensitive enables case-insensitive matches. Default is false.
// values:
// - false
// - true
CaseInsensitive bool `yaml:"case-insensitive,omitempty" jsonschema:"title=use case insensitive match,description=use case insensitive match"`
// cached data for the compiled matcher
condition ConditionType
matcherType MatcherType
binaryDecoded []string
regexCompiled []*regexp.Regexp
dslCompiled []*govaluate.EvaluableExpression
}
// MatcherType is the type of the matcher specified
type MatcherType = int
const (
// WordsMatcher matches responses with words
WordsMatcher MatcherType = iota + 1
// RegexMatcher matches responses with regexes
RegexMatcher
// BinaryMatcher matches responses with words
BinaryMatcher
// StatusMatcher matches responses with status codes
StatusMatcher
// SizeMatcher matches responses with response size
SizeMatcher
// DSLMatcher matches based upon dsl syntax
DSLMatcher
)
// MatcherTypes is a table for conversion of matcher type from string.
var MatcherTypes = map[string]MatcherType{
"status": StatusMatcher,
"size": SizeMatcher,
"word": WordsMatcher,
"regex": RegexMatcher,
"binary": BinaryMatcher,
"dsl": DSLMatcher,
}
// ConditionType is the type of condition for matcher
type ConditionType int
@ -173,7 +153,3 @@ func (m *Matcher) ResultWithMatchedSnippet(data bool, matchedSnippet []string) (
return data, matchedSnippet
}
// GetType returns the type of the matcher
func (m *Matcher) GetType() MatcherType {
return m.matcherType
}

View File

@ -0,0 +1,115 @@
package matchers
import (
"encoding/json"
"errors"
"strings"
"github.com/alecthomas/jsonschema"
)
// MatcherType is the type of the matcher specified
type MatcherType int
const (
// WordsMatcher matches responses with words
WordsMatcher MatcherType = iota + 1
// RegexMatcher matches responses with regexes
RegexMatcher
// BinaryMatcher matches responses with binary data
BinaryMatcher
// StatusMatcher matches responses with status codes
StatusMatcher
// SizeMatcher matches responses with response size
SizeMatcher
// DSLMatcher matches based upon dsl syntax
DSLMatcher
//limit
limit
)
// MatcherTypes is a table for conversion of matcher type from string.
var MatcherTypes = map[MatcherType]string{
StatusMatcher: "status",
SizeMatcher: "size",
WordsMatcher: "word",
RegexMatcher: "regex",
BinaryMatcher: "binary",
DSLMatcher: "dsl",
}
// GetType returns the type of the matcher
func (e *Matcher) GetType() MatcherType {
return e.Type.MatcherType
}
// GetSupportedMatcherTypes returns list of supported types
func GetSupportedMatcherTypes() []MatcherType {
var result []MatcherType
for index := MatcherType(1); index < limit; index++ {
result = append(result, index)
}
return result
}
func toMatcherTypes(valueToMap string) (MatcherType, error) {
normalizedValue := normalizeValue(valueToMap)
for key, currentValue := range MatcherTypes {
if normalizedValue == currentValue {
return key, nil
}
}
return -1, errors.New("Invalid matcher type: " + valueToMap)
}
func normalizeValue(value string) string {
return strings.TrimSpace(strings.ToLower(value))
}
func (t MatcherType) String() string {
return MatcherTypes[t]
}
// MatcherTypeHolder is used to hold internal type of the matcher
type MatcherTypeHolder struct {
MatcherType MatcherType
}
func (t MatcherTypeHolder) String() string {
return t.MatcherType.String()
}
func (holder MatcherTypeHolder) JSONSchemaType() *jsonschema.Type {
gotType := &jsonschema.Type{
Type: "string",
Title: "type of the matcher",
Description: "Type of the matcher,enum=status,enum=size,enum=word,enum=regex,enum=binary,enum=dsl",
}
for _, types := range GetSupportedMatcherTypes() {
gotType.Enum = append(gotType.Enum, types.String())
}
return gotType
}
func (holder *MatcherTypeHolder) UnmarshalYAML(unmarshal func(interface{}) error) error {
var marshalledTypes string
if err := unmarshal(&marshalledTypes); err != nil {
return err
}
computedType, err := toMatcherTypes(marshalledTypes)
if err != nil {
return err
}
holder.MatcherType = computedType
return nil
}
func (holder MatcherTypeHolder) MarshalJSON() ([]byte, error) {
return json.Marshal(holder.MatcherType.String())
}
func (holder MatcherTypeHolder) MarshalYAML() (interface{}, error) {
return holder.MatcherType.String(), nil
}

View File

@ -179,7 +179,7 @@ func getMatcherName(matcher *matchers.Matcher, matcherIndex int) string {
if matcher.Name != "" {
return matcher.Name
} else {
return matcher.Type + "-" + strconv.Itoa(matcherIndex+1) // making the index start from 1 to be more readable
return matcher.Type.String() + "-" + strconv.Itoa(matcherIndex+1) // making the index start from 1 to be more readable
}
}

View File

@ -2,11 +2,13 @@ package output
import (
"os"
"sync"
)
// fileWriter is a concurrent file based output writer.
type fileWriter struct {
file *os.File
mu sync.Mutex
}
// newFileOutputWriter creates a new buffered writer for a file
@ -19,16 +21,22 @@ func newFileOutputWriter(file string) (*fileWriter, error) {
}
// Write writes an output to the underlying file
func (w *fileWriter) Write(data []byte) error {
func (w *fileWriter) Write(data []byte) (int, error) {
w.mu.Lock()
defer w.mu.Unlock()
if _, err := w.file.Write(data); err != nil {
return err
return 0, err
}
_, err := w.file.Write([]byte("\n"))
return err
if _, err := w.file.Write([]byte("\n")); err != nil {
return 0, err
}
return len(data) + 1, nil
}
// Close closes the underlying writer flushing everything to disk
func (w *fileWriter) Close() error {
w.mu.Lock()
defer w.mu.Unlock()
//nolint:errcheck // we don't care whether sync failed or succeeded.
w.file.Sync()
return w.file.Close()

View File

@ -27,6 +27,15 @@ func (w *StandardWriter) formatScreen(output *ResultEvent) []byte {
builder.WriteString(w.aurora.BrightGreen(output.ExtractorName).Bold().String())
}
if w.matcherStatus {
builder.WriteString("] [")
if !output.MatcherStatus {
builder.WriteString(w.aurora.Red("failed").String())
} else {
builder.WriteString(w.aurora.Green("matched").String())
}
}
builder.WriteString("] [")
builder.WriteString(w.aurora.BrightBlue(output.Type).String())
builder.WriteString("] ")
@ -35,7 +44,11 @@ func (w *StandardWriter) formatScreen(output *ResultEvent) []byte {
builder.WriteString(w.severityColors(output.Info.SeverityHolder.Severity))
builder.WriteString("] ")
}
if output.Matched != "" {
builder.WriteString(output.Matched)
} else {
builder.WriteString(output.Host)
}
// If any extractors, write the results
if len(output.ExtractedResults) > 0 {

View File

@ -1,9 +1,9 @@
package output
import (
"io"
"os"
"regexp"
"sync"
"time"
"github.com/pkg/errors"
@ -16,6 +16,8 @@ import (
"github.com/projectdiscovery/nuclei/v2/pkg/model"
"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
"github.com/projectdiscovery/nuclei/v2/pkg/operators"
"github.com/projectdiscovery/nuclei/v2/pkg/types"
"github.com/projectdiscovery/nuclei/v2/pkg/utils"
)
// Writer is an interface which writes output to somewhere for nuclei events.
@ -26,6 +28,8 @@ type Writer interface {
Colorizer() aurora.Aurora
// Write writes the event to file and/or screen.
Write(*ResultEvent) error
// WriteFailure writes the optional failure event for template to file and/or screen.
WriteFailure(event InternalEvent) error
// Request logs a request in the trace log
Request(templateID, url, requestType string, err error)
}
@ -36,11 +40,11 @@ type StandardWriter struct {
jsonReqResp bool
noTimestamp bool
noMetadata bool
matcherStatus bool
aurora aurora.Aurora
outputFile *fileWriter
outputMutex *sync.Mutex
traceFile *fileWriter
traceMutex *sync.Mutex
outputFile io.WriteCloser
traceFile io.WriteCloser
errorFile io.WriteCloser
severityColors func(severity.Severity) string
}
@ -54,10 +58,16 @@ type InternalWrappedEvent struct {
InternalEvent InternalEvent
Results []*ResultEvent
OperatorsResult *operators.Result
UsesInteractsh bool
}
// ResultEvent is a wrapped result event for a single nuclei output.
type ResultEvent struct {
// Template is the relative filename for the template
Template string `json:"template,omitempty"`
// TemplateURL is the URL of the template for the result inside the nuclei
// templates repository if it belongs to the repository.
TemplateURL string `json:"template-url,omitempty"`
// TemplateID is the ID of the template for the result.
TemplateID string `json:"template-id"`
// TemplatePath is the path of template
@ -93,14 +103,16 @@ type ResultEvent struct {
// CURLCommand is an optional curl command to reproduce the request
// Only applicable if the report is for HTTP.
CURLCommand string `json:"curl-command,omitempty"`
// MatcherStatus is the status of the match
MatcherStatus bool `json:"matcher-status"`
FileToIndexPosition map[string]int `json:"-"`
}
// NewStandardWriter creates a new output writer based on user configurations
func NewStandardWriter(colors, noMetadata, noTimestamp, json, jsonReqResp bool, file, traceFile string) (*StandardWriter, error) {
func NewStandardWriter(colors, noMetadata, noTimestamp, json, jsonReqResp, MatcherStatus bool, file, traceFile string, errorFile string) (*StandardWriter, error) {
auroraColorizer := aurora.NewAurora(colors)
var outputFile *fileWriter
var outputFile io.WriteCloser
if file != "" {
output, err := newFileOutputWriter(file)
if err != nil {
@ -108,7 +120,7 @@ func NewStandardWriter(colors, noMetadata, noTimestamp, json, jsonReqResp bool,
}
outputFile = output
}
var traceOutput *fileWriter
var traceOutput io.WriteCloser
if traceFile != "" {
output, err := newFileOutputWriter(traceFile)
if err != nil {
@ -116,16 +128,24 @@ func NewStandardWriter(colors, noMetadata, noTimestamp, json, jsonReqResp bool,
}
traceOutput = output
}
var errorOutput io.WriteCloser
if errorFile != "" {
output, err := newFileOutputWriter(errorFile)
if err != nil {
return nil, errors.Wrap(err, "could not create error file")
}
errorOutput = output
}
writer := &StandardWriter{
json: json,
jsonReqResp: jsonReqResp,
noMetadata: noMetadata,
matcherStatus: MatcherStatus,
noTimestamp: noTimestamp,
aurora: auroraColorizer,
outputFile: outputFile,
outputMutex: &sync.Mutex{},
traceFile: traceOutput,
traceMutex: &sync.Mutex{},
errorFile: errorOutput,
severityColors: colorizer.New(auroraColorizer),
}
return writer, nil
@ -133,6 +153,10 @@ func NewStandardWriter(colors, noMetadata, noTimestamp, json, jsonReqResp bool,
// Write writes the event to file and/or screen.
func (w *StandardWriter) Write(event *ResultEvent) error {
// Enrich the result event with extra metadata on the template-path and url.
if event.TemplatePath != "" {
event.Template, event.TemplateURL = utils.TemplatePathURL(types.ToString(event.TemplatePath))
}
event.Timestamp = time.Now()
var data []byte
@ -155,33 +179,33 @@ func (w *StandardWriter) Write(event *ResultEvent) error {
if !w.json {
data = decolorizerRegex.ReplaceAll(data, []byte(""))
}
if writeErr := w.outputFile.Write(data); writeErr != nil {
if _, writeErr := w.outputFile.Write(data); writeErr != nil {
return errors.Wrap(err, "could not write to output")
}
}
return nil
}
// JSONTraceRequest is a trace log request written to file
type JSONTraceRequest struct {
ID string `json:"id"`
URL string `json:"url"`
// JSONLogRequest is a trace/error log request written to file
type JSONLogRequest struct {
Template string `json:"template"`
Input string `json:"input"`
Error string `json:"error"`
Type string `json:"type"`
}
// Request logs a request to the trace and error logs
func (w *StandardWriter) Request(templateID, url, requestType string, err error) {
if w.traceFile == nil {
func (w *StandardWriter) Request(templatePath, input, requestType string, requestErr error) {
if w.traceFile == nil && w.errorFile == nil {
return
}
request := &JSONTraceRequest{
ID: templateID,
URL: url,
request := &JSONLogRequest{
Template: templatePath,
Input: input,
Type: requestType,
}
if err != nil {
request.Error = err.Error()
if unwrappedErr := utils.UnwrapError(requestErr); unwrappedErr != nil {
request.Error = unwrappedErr.Error()
} else {
request.Error = "none"
}
@ -190,9 +214,14 @@ func (w *StandardWriter) Request(templateID, url, requestType string, err error)
if err != nil {
return
}
w.traceMutex.Lock()
_ = w.traceFile.Write(data)
w.traceMutex.Unlock()
if w.traceFile != nil {
_, _ = w.traceFile.Write(data)
}
if requestErr != nil && w.errorFile != nil {
_, _ = w.errorFile.Write(data)
}
}
// Colorizer returns the colorizer instance for writer
@ -208,4 +237,27 @@ func (w *StandardWriter) Close() {
if w.traceFile != nil {
w.traceFile.Close()
}
if w.errorFile != nil {
w.errorFile.Close()
}
}
// WriteFailure writes the failure event for template to file and/or screen.
func (w *StandardWriter) WriteFailure(event InternalEvent) error {
if !w.matcherStatus {
return nil
}
templatePath, templateURL := utils.TemplatePathURL(types.ToString(event["template-path"]))
data := &ResultEvent{
Template: templatePath,
TemplateURL: templateURL,
TemplateID: types.ToString(event["template-id"]),
TemplatePath: types.ToString(event["template-path"]),
Info: event["template-info"].(model.Info),
Type: types.ToString(event["type"]),
Host: types.ToString(event["host"]),
MatcherStatus: false,
Timestamp: time.Now(),
}
return w.Write(data)
}

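Putting the new signature and the trace/error split together, a caller-side sketch (file names are illustrative) looks like this: both calls land in the trace log, and only the failing one is duplicated into the error log.

```go
package main

import (
	"errors"

	"github.com/projectdiscovery/nuclei/v2/pkg/output"
)

func main() {
	// colors, noMetadata, noTimestamp, json, jsonReqResp, matcherStatus,
	// output file, trace file, error file.
	writer, err := output.NewStandardWriter(true, false, false, false, false, true,
		"results.txt", "trace.jsonl", "errors.jsonl")
	if err != nil {
		panic(err)
	}
	defer writer.Close()

	writer.Request("misconfiguration/tcpconfig.yaml", "https://example.com", "http", nil)
	writer.Request("misconfiguration/tcpconfig.yaml", "https://example.com", "http",
		errors.New("context deadline exceeded"))
}
```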
View File

@ -0,0 +1,59 @@
package output
import (
"fmt"
"strings"
"testing"
"github.com/pkg/errors"
"github.com/stretchr/testify/require"
)
func TestStandardWriterRequest(t *testing.T) {
t.Run("WithoutTraceAndError", func(t *testing.T) {
w, err := NewStandardWriter(false, false, false, false, false, false, "", "", "")
require.NoError(t, err)
require.NotPanics(t, func() {
w.Request("path", "input", "http", nil)
w.Close()
})
})
t.Run("TraceAndErrorWithoutError", func(t *testing.T) {
traceWriter := &testWriteCloser{}
errorWriter := &testWriteCloser{}
w, err := NewStandardWriter(false, false, false, false, false, false, "", "", "")
w.traceFile = traceWriter
w.errorFile = errorWriter
require.NoError(t, err)
w.Request("path", "input", "http", nil)
require.Equal(t, `{"template":"path","input":"input","error":"none","type":"http"}`, traceWriter.String())
require.Empty(t, errorWriter.String())
})
t.Run("ErrorWithWrappedError", func(t *testing.T) {
errorWriter := &testWriteCloser{}
w, err := NewStandardWriter(false, false, false, false, false, false, "", "", "")
w.errorFile = errorWriter
require.NoError(t, err)
w.Request(
"misconfiguration/tcpconfig.yaml",
"https://example.com/tcpconfig.html",
"http",
fmt.Errorf("GET https://example.com/tcpconfig.html/tcpconfig.html giving up after 2 attempts: %w", errors.New("context deadline exceeded (Client.Timeout exceeded while awaiting headers)")),
)
require.Equal(t, `{"template":"misconfiguration/tcpconfig.yaml","input":"https://example.com/tcpconfig.html","error":"context deadline exceeded (Client.Timeout exceeded while awaiting headers)","type":"http"}`, errorWriter.String())
})
}
type testWriteCloser struct {
strings.Builder
}
func (w testWriteCloser) Close() error {
return nil
}

View File

@ -5,6 +5,7 @@ import (
"io/ioutil"
"os"
"regexp"
"strings"
"gopkg.in/yaml.v2"
@ -13,11 +14,15 @@ import (
"github.com/projectdiscovery/nuclei/v2/pkg/model"
"github.com/projectdiscovery/nuclei/v2/pkg/templates"
"github.com/projectdiscovery/nuclei/v2/pkg/templates/cache"
"github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
"github.com/projectdiscovery/nuclei/v2/pkg/utils"
"github.com/projectdiscovery/nuclei/v2/pkg/utils/stats"
)
const mandatoryFieldMissingTemplate = "mandatory '%s' field is missing"
const (
mandatoryFieldMissingTemplate = "mandatory '%s' field is missing"
invalidFieldFormatTemplate = "invalid field format for '%s' (allowed format is %s)"
)
// LoadTemplate returns true if the template is valid and matches the filtering criteria.
func LoadTemplate(templatePath string, tagFilter *filter.TagFilter, extraTags []string) (bool, error) {
@ -30,12 +35,12 @@ func LoadTemplate(templatePath string, tagFilter *filter.TagFilter, extraTags []
return false, nil
}
templateInfo := template.Info
if validationError := validateMandatoryInfoFields(&templateInfo); validationError != nil {
if validationError := validateTemplateFields(template); validationError != nil {
stats.Increment(SyntaxErrorStats)
return false, validationError
}
return isTemplateInfoMetadataMatch(tagFilter, &templateInfo, extraTags)
return isTemplateInfoMetadataMatch(tagFilter, &template.Info, extraTags, template.Type())
}
// LoadWorkflow returns true if the workflow is valid and matches the filtering criteria.
@ -45,10 +50,8 @@ func LoadWorkflow(templatePath string) (bool, error) {
return false, templateParseError
}
templateInfo := template.Info
if len(template.Workflows) > 0 {
if validationError := validateMandatoryInfoFields(&templateInfo); validationError != nil {
if validationError := validateTemplateFields(template); validationError != nil {
return false, validationError
}
return true, nil
@ -57,12 +60,12 @@ func LoadWorkflow(templatePath string) (bool, error) {
return false, nil
}
func isTemplateInfoMetadataMatch(tagFilter *filter.TagFilter, templateInfo *model.Info, extraTags []string) (bool, error) {
func isTemplateInfoMetadataMatch(tagFilter *filter.TagFilter, templateInfo *model.Info, extraTags []string, templateType types.ProtocolType) (bool, error) {
templateTags := templateInfo.Tags.ToSlice()
templateAuthors := templateInfo.Authors.ToSlice()
templateSeverity := templateInfo.SeverityHolder.Severity
match, err := tagFilter.Match(templateTags, templateAuthors, templateSeverity, extraTags)
match, err := tagFilter.Match(templateTags, templateAuthors, templateSeverity, extraTags, templateType)
if err == filter.ErrExcluded {
return false, filter.ErrExcluded
@ -71,18 +74,29 @@ func isTemplateInfoMetadataMatch(tagFilter *filter.TagFilter, templateInfo *mode
return match, err
}
func validateMandatoryInfoFields(info *model.Info) error {
if info == nil {
return fmt.Errorf(mandatoryFieldMissingTemplate, "info")
}
func validateTemplateFields(template *templates.Template) error {
info := template.Info
var errors []string
if utils.IsBlank(info.Name) {
return fmt.Errorf(mandatoryFieldMissingTemplate, "name")
errors = append(errors, fmt.Sprintf(mandatoryFieldMissingTemplate, "name"))
}
if info.Authors.IsEmpty() {
return fmt.Errorf(mandatoryFieldMissingTemplate, "author")
errors = append(errors, fmt.Sprintf(mandatoryFieldMissingTemplate, "author"))
}
if template.ID == "" {
errors = append(errors, fmt.Sprintf(mandatoryFieldMissingTemplate, "id"))
} else if !templateIDRegexp.MatchString(template.ID) {
errors = append(errors, fmt.Sprintf(invalidFieldFormatTemplate, "id", templateIDRegexp.String()))
}
if len(errors) > 0 {
return fmt.Errorf(strings.Join(errors, ", "))
}
return nil
}
@ -90,6 +104,7 @@ var (
parsedTemplatesCache *cache.Templates
ShouldValidate bool
fieldErrorRegexp = regexp.MustCompile(`not found in`)
templateIDRegexp = regexp.MustCompile(`^([a-zA-Z0-9]+[-_])*[a-zA-Z0-9]+$`)
)
const (

View File

@ -0,0 +1,110 @@
package parsers
import (
"errors"
"fmt"
"testing"
"github.com/projectdiscovery/nuclei/v2/pkg/catalog/loader/filter"
"github.com/projectdiscovery/nuclei/v2/pkg/model"
"github.com/projectdiscovery/nuclei/v2/pkg/model/types/stringslice"
"github.com/projectdiscovery/nuclei/v2/pkg/templates"
"github.com/stretchr/testify/require"
)
func TestLoadTemplate(t *testing.T) {
origTemplatesCache := parsedTemplatesCache
defer func() { parsedTemplatesCache = origTemplatesCache }()
tt := []struct {
name string
template *templates.Template
templateErr error
expectedErr error
}{
{
name: "valid",
template: &templates.Template{
ID: "CVE-2021-27330",
Info: model.Info{
Name: "Valid template",
Authors: stringslice.StringSlice{Value: "Author"},
},
},
},
{
name: "emptyTemplate",
template: &templates.Template{},
expectedErr: errors.New("mandatory 'name' field is missing, mandatory 'author' field is missing, mandatory 'id' field is missing"),
},
{
name: "emptyNameWithInvalidID",
template: &templates.Template{
ID: "invalid id",
Info: model.Info{
Authors: stringslice.StringSlice{Value: "Author"},
},
},
expectedErr: errors.New("mandatory 'name' field is missing, invalid field format for 'id' (allowed format is ^([a-zA-Z0-9]+[-_])*[a-zA-Z0-9]+$)"),
},
}
for _, tc := range tt {
t.Run(tc.name, func(t *testing.T) {
parsedTemplatesCache.Store(tc.name, tc.template, tc.templateErr)
tagFilter := filter.New(&filter.Config{})
success, err := LoadTemplate(tc.name, tagFilter, nil)
if tc.expectedErr == nil {
require.NoError(t, err)
require.True(t, success)
} else {
require.Equal(t, tc.expectedErr, err)
require.False(t, success)
}
})
}
t.Run("invalidTemplateID", func(t *testing.T) {
tt := []struct {
id string
success bool
}{
{id: "A-B-C", success: true},
{id: "A-B-C-1", success: true},
{id: "CVE_2021_27330", success: true},
{id: "ABC DEF", success: false},
{id: "_-__AAA_", success: false},
{id: " CVE-2021-27330", success: false},
{id: "CVE-2021-27330 ", success: false},
{id: "CVE-2021-27330-", success: false},
{id: "-CVE-2021-27330-", success: false},
{id: "CVE-2021--27330", success: false},
{id: "CVE-2021+27330", success: false},
}
for i, tc := range tt {
name := fmt.Sprintf("regexp%d", i)
t.Run(name, func(t *testing.T) {
template := &templates.Template{
ID: tc.id,
Info: model.Info{
Name: "Valid template",
Authors: stringslice.StringSlice{Value: "Author"},
},
}
parsedTemplatesCache.Store(name, template, nil)
tagFilter := filter.New(&filter.Config{})
success, err := LoadTemplate(name, tagFilter, nil)
if tc.success {
require.NoError(t, err)
require.True(t, success)
} else {
require.Equal(t, errors.New("invalid field format for 'id' (allowed format is ^([a-zA-Z0-9]+[-_])*[a-zA-Z0-9]+$)"), err)
require.False(t, success)
}
})
}
})
}

View File

@ -18,7 +18,7 @@ func NewLoader(options *protocols.ExecuterOptions) (model.WorkflowLoader, error)
tagFilter := filter.New(&filter.Config{
Tags: options.Options.Tags,
ExcludeTags: options.Options.ExcludeTags,
Authors: options.Options.Author,
Authors: options.Options.Authors,
Severities: options.Options.Severities,
IncludeTags: options.Options.IncludeTags,
})

View File

@ -1,49 +0,0 @@
package clusterer
import (
"github.com/projectdiscovery/nuclei/v2/pkg/templates"
)
// Cluster clusters a list of templates into a lesser number if possible based
// on the similarity between the sent requests.
//
// If the attributes match, multiple requests can be clustered into a single
// request which saves time and network resources during execution.
func Cluster(list map[string]*templates.Template) [][]*templates.Template {
final := [][]*templates.Template{}
// Each protocol that can be clustered should be handled here.
for key, template := range list {
// We only cluster http requests as of now.
// Take care of requests that can't be clustered first.
if len(template.RequestsHTTP) == 0 {
delete(list, key)
final = append(final, []*templates.Template{template})
continue
}
delete(list, key) // delete element first so it's not found later.
// Find any/all similar matching request that is identical to
// this one and cluster them together for http protocol only.
if len(template.RequestsHTTP) == 1 {
cluster := []*templates.Template{}
for otherKey, other := range list {
if len(other.RequestsHTTP) == 0 {
continue
}
if template.RequestsHTTP[0].CanCluster(other.RequestsHTTP[0]) {
delete(list, otherKey)
cluster = append(cluster, other)
}
}
if len(cluster) > 0 {
cluster = append(cluster, template)
final = append(final, cluster)
continue
}
}
final = append(final, []*templates.Template{template})
}
return final
}

View File

@ -6,6 +6,7 @@ import (
"github.com/projectdiscovery/gologger"
"github.com/projectdiscovery/nuclei/v2/pkg/output"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/helpers/writer"
)
// Executer executes a group of requests for a protocol
@ -47,8 +48,6 @@ func (e *Executer) Execute(input string) (bool, error) {
dynamicValues := make(map[string]interface{})
previous := make(map[string]interface{})
for _, req := range e.requests {
req := req
err := req.ExecuteWithResults(input, dynamicValues, previous, func(event *output.InternalWrappedEvent) {
ID := req.GetID()
if ID != "" {
@ -61,18 +60,17 @@ func (e *Executer) Execute(input string) (bool, error) {
builder.Reset()
}
}
if event.OperatorsResult == nil {
return
}
for _, result := range event.Results {
if e.options.IssuesClient != nil {
if err := e.options.IssuesClient.CreateIssue(result); err != nil {
gologger.Warning().Msgf("Could not create issue on tracker: %s", err)
}
// If no results were found and interactsh is not being used,
// we can skip it; otherwise we have to report a failure when
// the matcher-status flag is set.
if event.OperatorsResult == nil && !event.UsesInteractsh {
if err := e.options.Output.WriteFailure(event.InternalEvent); err != nil {
gologger.Warning().Msgf("Could not write failure event to output: %s\n", err)
}
} else {
if writer.WriteResult(event, e.options.Output, e.options.Progress, e.options.IssuesClient) {
results = true
_ = e.options.Output.Write(result)
e.options.Progress.IncrementMatched()
}
}
})
if err != nil {

View File

@ -10,7 +10,8 @@ var unresolvedVariablesRegex = regexp.MustCompile(`(?:%7[B|b]|\{){2}([^}]+)(?:%7
// ContainsUnresolvedVariables returns an error with variable names if the passed
// input contains unresolved {{<pattern-here>}} variables.
func ContainsUnresolvedVariables(data string) error {
func ContainsUnresolvedVariables(items ...string) error {
for _, data := range items {
matches := unresolvedVariablesRegex.FindAllStringSubmatch(data, -1)
if len(matches) == 0 {
return nil
@ -29,9 +30,13 @@ func ContainsUnresolvedVariables(data string) error {
}
errorMessage := errorString.String()
return errors.New(errorMessage)
}
return nil
}
func ContainsVariablesWithNames(data string, names map[string]interface{}) error {
func ContainsVariablesWithNames(names map[string]interface{}, items ...string) error {
for _, data := range items {
matches := unresolvedVariablesRegex.FindAllStringSubmatch(data, -1)
if len(matches) == 0 {
return nil
@ -53,4 +58,7 @@ func ContainsVariablesWithNames(data string, names map[string]interface{}) error
}
errorMessage := errorString.String()
return errors.New(errorMessage)
}
return nil
}

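With the variadic signature, callers can validate several request parts in a single call; a minimal sketch using the package path imported elsewhere in this change:

```go
package main

import (
	"fmt"

	"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/expressions"
)

func main() {
	// The path and a header are checked together; the returned error names the
	// unresolved variables found in the first offending item.
	err := expressions.ContainsUnresolvedVariables(
		"/login?user={{username}}",
		"Authorization: {{token}}",
	)
	fmt.Println(err != nil) // true
}
```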
View File

@ -0,0 +1,96 @@
package generators
import (
"encoding/json"
"strings"
"github.com/alecthomas/jsonschema"
"github.com/pkg/errors"
)
// AttackType is the type of attack for payloads
type AttackType int
// Supported values for the AttackType
const (
// BatteringRamAttack replaces the same payload into all of the defined payload positions at once.
BatteringRamAttack AttackType = iota + 1
// PitchForkAttack replaces variables with positional value from multiple wordlists
PitchForkAttack
// ClusterbombAttack replaces variables with all possible combinations of values
ClusterbombAttack
limit
)
// attackTypeMappings is a table for conversion of attack type from string.
var attackTypeMappings = map[AttackType]string{
BatteringRamAttack: "batteringram",
PitchForkAttack: "pitchfork",
ClusterbombAttack: "clusterbomb",
}
func GetSupportedAttackTypes() []AttackType {
var result []AttackType
for index := AttackType(1); index < limit; index++ {
result = append(result, index)
}
return result
}
func toAttackType(valueToMap string) (AttackType, error) {
normalizedValue := normalizeValue(valueToMap)
for key, currentValue := range attackTypeMappings {
if normalizedValue == currentValue {
return key, nil
}
}
return -1, errors.New("invalid attack type: " + valueToMap)
}
func normalizeValue(value string) string {
return strings.TrimSpace(strings.ToLower(value))
}
func (t AttackType) String() string {
return attackTypeMappings[t]
}
// AttackTypeHolder is used to hold internal type of the protocol
type AttackTypeHolder struct {
Value AttackType
}
func (holder AttackTypeHolder) JSONSchemaType() *jsonschema.Type {
gotType := &jsonschema.Type{
Type: "string",
Title: "type of the attack",
Description: "Type of the attack",
}
for _, types := range GetSupportedAttackTypes() {
gotType.Enum = append(gotType.Enum, types.String())
}
return gotType
}
func (holder *AttackTypeHolder) UnmarshalYAML(unmarshal func(interface{}) error) error {
var marshalledTypes string
if err := unmarshal(&marshalledTypes); err != nil {
return err
}
computedType, err := toAttackType(marshalledTypes)
if err != nil {
return err
}
holder.Value = computedType
return nil
}
func (holder *AttackTypeHolder) MarshalJSON() ([]byte, error) {
return json.Marshal(holder.Value.String())
}
func (holder AttackTypeHolder) MarshalYAML() (interface{}, error) {
return holder.Value.String(), nil
}

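For reference, a tiny sketch of the enum/string mapping these holders build on (import path assumed as `pkg/protocols/common/generators`); the same strings feed both YAML parsing and the generated JSON schema enums.

```go
package main

import (
	"fmt"

	"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/generators" // assumed path
)

func main() {
	for _, attackType := range generators.GetSupportedAttackTypes() {
		fmt.Println(attackType.String())
	}
	// batteringram
	// pitchfork
	// clusterbomb
}
```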
View File

@ -2,49 +2,53 @@
package generators
import "github.com/pkg/errors"
import (
"github.com/pkg/errors"
"github.com/projectdiscovery/nuclei/v2/pkg/catalog"
)
// Generator is the generator struct for generating payloads
type Generator struct {
Type Type
// PayloadGenerator is the generator struct for generating payloads
type PayloadGenerator struct {
Type AttackType
payloads map[string][]string
}
// Type is type of attack
type Type int
const (
// Batteringram replaces same payload into all of the defined payload positions at once.
BatteringRam Type = iota + 1
// PitchFork replaces variables with positional value from multiple wordlists
PitchFork
// ClusterBomb replaces variables with all possible combinations of values
ClusterBomb
)
// StringToType is a table for conversion of attack type from string.
var StringToType = map[string]Type{
"batteringram": BatteringRam,
"pitchfork": PitchFork,
"clusterbomb": ClusterBomb,
}
// New creates a new generator structure for payload generation
func New(payloads map[string]interface{}, payloadType Type, templatePath string) (*Generator, error) {
generator := &Generator{}
func New(payloads map[string]interface{}, attackType AttackType, templatePath string, catalog *catalog.Catalog) (*PayloadGenerator, error) {
if attackType.String() == "" {
attackType = BatteringRamAttack
}
// Resolve payload paths if they are files.
payloadsFinal := make(map[string]interface{})
for name, payload := range payloads {
payloadsFinal[name] = payload
}
for name, payload := range payloads {
payloadStr, ok := payload.(string)
if ok {
final, resolveErr := catalog.ResolvePath(payloadStr, templatePath)
if resolveErr != nil {
return nil, errors.Wrap(resolveErr, "could not read payload file")
}
payloadsFinal[name] = final
}
}
generator := &PayloadGenerator{}
if err := generator.validate(payloads, templatePath); err != nil {
return nil, err
}
compiled, err := loadPayloads(payloads)
compiled, err := loadPayloads(payloadsFinal)
if err != nil {
return nil, err
}
generator.Type = payloadType
generator.Type = attackType
generator.payloads = compiled
// Validate the batteringram payload set
if payloadType == BatteringRam {
if attackType == BatteringRamAttack {
if len(payloads) != 1 {
return nil, errors.New("batteringram must have single payload set")
}
@ -54,7 +58,7 @@ func New(payloads map[string]interface{}, payloadType Type, templatePath string)
// Iterator is a single instance of an iterator for a generator structure
type Iterator struct {
Type Type
Type AttackType
position int
msbIterator int
total int
@ -62,7 +66,7 @@ type Iterator struct {
}
// NewIterator creates a new iterator for the payloads generator
func (g *Generator) NewIterator() *Iterator {
func (g *PayloadGenerator) NewIterator() *Iterator {
var payloads []*payloadIterator
for name, values := range g.payloads {
@ -95,18 +99,18 @@ func (i *Iterator) Remaining() int {
func (i *Iterator) Total() int {
count := 0
switch i.Type {
case BatteringRam:
case BatteringRamAttack:
for _, p := range i.payloads {
count += len(p.values)
}
case PitchFork:
case PitchForkAttack:
count = len(i.payloads[0].values)
for _, p := range i.payloads {
if count > len(p.values) {
count = len(p.values)
}
}
case ClusterBomb:
case ClusterbombAttack:
count = 1
for _, p := range i.payloads {
count *= len(p.values)
@ -118,11 +122,11 @@ func (i *Iterator) Total() int {
// Value returns the next value for an iterator
func (i *Iterator) Value() (map[string]interface{}, bool) {
switch i.Type {
case BatteringRam:
case BatteringRamAttack:
return i.batteringRamValue()
case PitchFork:
case PitchForkAttack:
return i.pitchforkValue()
case ClusterBomb:
case ClusterbombAttack:
return i.clusterbombValue()
default:
return i.batteringRamValue()

View File

@ -3,13 +3,15 @@ package generators
import (
"testing"
"github.com/projectdiscovery/nuclei/v2/pkg/catalog"
"github.com/stretchr/testify/require"
)
func TestBatteringRamGenerator(t *testing.T) {
usernames := []string{"admin", "password"}
generator, err := New(map[string]interface{}{"username": usernames}, BatteringRam, "")
catalogInstance := catalog.New("")
generator, err := New(map[string]interface{}{"username": usernames}, BatteringRamAttack, "", catalogInstance)
require.Nil(t, err, "could not create generator")
iterator := generator.NewIterator()
@ -28,7 +30,8 @@ func TestPitchforkGenerator(t *testing.T) {
usernames := []string{"admin", "token"}
passwords := []string{"password1", "password2", "password3"}
generator, err := New(map[string]interface{}{"username": usernames, "password": passwords}, PitchFork, "")
catalogInstance := catalog.New("")
generator, err := New(map[string]interface{}{"username": usernames, "password": passwords}, PitchForkAttack, "", catalogInstance)
require.Nil(t, err, "could not create generator")
iterator := generator.NewIterator()
@ -49,7 +52,8 @@ func TestClusterbombGenerator(t *testing.T) {
usernames := []string{"admin"}
passwords := []string{"admin", "password", "token"}
generator, err := New(map[string]interface{}{"username": usernames, "password": passwords}, ClusterBomb, "")
catalogInstance := catalog.New("")
generator, err := New(map[string]interface{}{"username": usernames, "password": passwords}, ClusterbombAttack, "", catalogInstance)
require.Nil(t, err, "could not create generator")
iterator := generator.NewIterator()

View File

@ -11,7 +11,7 @@ import (
)
// validate validates the payloads if any.
func (g *Generator) validate(payloads map[string]interface{}, templatePath string) error {
func (g *PayloadGenerator) validate(payloads map[string]interface{}, templatePath string) error {
for name, payload := range payloads {
switch pt := payload.(type) {
case string:

View File

@ -48,15 +48,19 @@ func gadgetEncodingHelper(returnData []byte, encoding string) string {
return hex.EncodeToString(returnData)
case "gzip":
buffer := &bytes.Buffer{}
if _, err := gzip.NewWriter(buffer).Write(returnData); err != nil {
writer := gzip.NewWriter(buffer)
if _, err := writer.Write(returnData); err != nil {
return ""
}
_ = writer.Close()
return buffer.String()
case "gzip-base64":
buffer := &bytes.Buffer{}
if _, err := gzip.NewWriter(buffer).Write(returnData); err != nil {
writer := gzip.NewWriter(buffer)
if _, err := writer.Write(returnData); err != nil {
return ""
}
_ = writer.Close()
return urlsafeBase64Encode(buffer.Bytes())
case "base64-raw":
return base64.StdEncoding.EncodeToString(returnData)

View File

@ -0,0 +1,130 @@
package responsehighlighter
import (
"errors"
"fmt"
"regexp"
"strings"
"unicode"
"github.com/projectdiscovery/gologger"
)
// [0-9a-fA-F]{8} {2} - hexdump indexes (8 character hex value followed by two spaces)
// [0-9a-fA-F]{2} + - 2 character long hex values followed by one or two spaces (potentially wrapped with an ANSI color code, see below)
// \x1b\[(\d;?)+m - ANSI color code pattern
// \x1b\[0m - ANSI color code reset
// \|(.*)\|\n - ASCII representation of the input delimited by pipe characters
var hexDumpParsePattern = regexp.MustCompile(`([0-9a-fA-F]{8} {2})((?:(?:\x1b\[(?:\d;?)+m)?[0-9a-fA-F]{2}(?:\x1b\[0m)? +)+)\|(.*)\|\n`)
var hexValuePattern = regexp.MustCompile(`([a-fA-F0-9]{2})`)
type HighlightableHexDump struct {
index []string
hex []string
ascii []string
}
func NewHighlightableHexDump(rowSize int) HighlightableHexDump {
return HighlightableHexDump{index: make([]string, 0, rowSize), hex: make([]string, 0, rowSize), ascii: make([]string, 0, rowSize)}
}
func (hexDump HighlightableHexDump) len() int {
return len(hexDump.index)
}
func (hexDump HighlightableHexDump) String() string {
var result string
for i := 0; i < hexDump.len(); i++ {
result += hexDump.index[i] + hexDump.hex[i] + "|" + hexDump.ascii[i] + "|\n"
}
return result
}
func toHighLightedHexDump(hexDump, snippetToHighlight string) (HighlightableHexDump, error) {
hexDumpRowValues := hexDumpParsePattern.FindAllStringSubmatch(hexDump, -1)
if hexDumpRowValues == nil || len(hexDumpRowValues) != strings.Count(hexDump, "\n") {
message := "could not parse hexdump"
gologger.Warning().Msgf(message)
return HighlightableHexDump{}, errors.New(message)
}
result := NewHighlightableHexDump(len(hexDumpRowValues))
for _, currentHexDumpRowValues := range hexDumpRowValues {
result.index = append(result.index, currentHexDumpRowValues[1])
result.hex = append(result.hex, currentHexDumpRowValues[2])
result.ascii = append(result.ascii, currentHexDumpRowValues[3])
}
return result.highlight(snippetToHighlight), nil
}
func (hexDump HighlightableHexDump) highlight(snippetToColor string) HighlightableHexDump {
return highlightAsciiSection(highlightHexSection(hexDump, snippetToColor), snippetToColor)
}
func highlightHexSection(hexDump HighlightableHexDump, snippetToColor string) HighlightableHexDump {
var snippetHexCharactersMatchPattern string
for _, char := range snippetToColor {
snippetHexCharactersMatchPattern += fmt.Sprintf(`(%02x[ \n]+)`, char)
}
hexDump.hex = highlight(hexDump.hex, snippetHexCharactersMatchPattern, func(v string) string {
return hexValuePattern.ReplaceAllString(v, addColor("$1"))
})
return hexDump
}
func highlightAsciiSection(hexDump HighlightableHexDump, snippetToColor string) HighlightableHexDump {
var snippetCharactersMatchPattern string
for _, v := range snippetToColor {
var value string
if IsASCIIPrintable(v) {
value = regexp.QuoteMeta(string(v))
} else {
value = "."
}
snippetCharactersMatchPattern += fmt.Sprintf(`(%s\n*)`, value)
}
hexDump.ascii = highlight(hexDump.ascii, snippetCharactersMatchPattern, func(v string) string {
if len(v) > 1 {
return addColor(string(v[0])) + v[1:] // do not color new line characters
}
return addColor(v)
})
return hexDump
}
func highlight(values []string, snippetCharactersMatchPattern string, replaceToFunc func(v string) string) []string {
rows := strings.Join(values, "\n")
compiledPattern := regexp.MustCompile(snippetCharactersMatchPattern)
for _, submatch := range compiledPattern.FindAllStringSubmatch(rows, -1) {
var replaceTo string
var replaceFrom string
for _, matchedValueWithSuffix := range submatch[1:] {
replaceFrom += matchedValueWithSuffix
replaceTo += replaceToFunc(matchedValueWithSuffix)
}
rows = strings.ReplaceAll(rows, replaceFrom, replaceTo)
}
return strings.Split(rows, "\n")
}
func HasBinaryContent(input string) bool {
return !IsASCII(input)
}
// IsASCII tests whether a string consists only of ASCII characters or not
func IsASCII(input string) bool {
for i := 0; i < len(input); i++ {
if input[i] > unicode.MaxASCII {
return false
}
}
return true
}
func IsASCIIPrintable(input rune) bool {
return input > 32 && input < unicode.MaxASCII
}

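The new hexdump helper above parses `hex.Dump` output with a regular expression into offset, hex and ASCII columns so a matched snippet can be colored in both the hex and ASCII sections. Below is a simplified, standalone sketch of just that row parsing; unlike the pattern above it does not account for ANSI codes already present in the dump, and the names are illustrative.

```go
package main

import (
	"encoding/hex"
	"fmt"
	"regexp"
)

// rowPattern captures the offset column, the hex column and the ASCII column
// of a single hex.Dump row (plain dumps only, no ANSI escape codes).
var rowPattern = regexp.MustCompile(`(?m)^([0-9a-fA-F]{8}  )((?:[0-9a-fA-F]{2} +)+)\|(.*)\|$`)

func main() {
	dump := hex.Dump([]byte("abcdefghijklmnopqrstuvwxyz"))
	for _, row := range rowPattern.FindAllStringSubmatch(dump, -1) {
		fmt.Printf("offset=%q hex=%q ascii=%q\n", row[1], row[2], row[3])
	}
}
```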
View File

@ -1,6 +1,7 @@
package responsehighlighter
import (
"sort"
"strconv"
"strings"
@ -9,16 +10,19 @@ import (
"github.com/projectdiscovery/nuclei/v2/pkg/operators"
)
var colorizer = aurora.NewAurora(true)
var colorFunction = aurora.Green
func Highlight(operatorResult *operators.Result, response string, noColor bool) string {
func Highlight(operatorResult *operators.Result, response string, noColor, hexDump bool) string {
result := response
if operatorResult != nil && !noColor {
for _, matches := range operatorResult.Matches {
if len(matches) > 0 {
for _, currentMatch := range matches {
result = strings.ReplaceAll(result, currentMatch, colorizer.Green(currentMatch).String())
for _, currentMatch := range getSortedMatches(operatorResult) {
if hexDump {
highlightedHexDump, err := toHighLightedHexDump(result, currentMatch)
if err == nil {
result = highlightedHexDump.String()
}
} else {
result = highlightASCII(currentMatch, result)
}
}
}
@ -26,6 +30,27 @@ func Highlight(operatorResult *operators.Result, response string, noColor bool)
return result
}
func highlightASCII(currentMatch string, result string) string {
var coloredMatchBuilder strings.Builder
for _, char := range currentMatch {
coloredMatchBuilder.WriteString(addColor(string(char)))
}
return strings.ReplaceAll(result, currentMatch, coloredMatchBuilder.String())
}
func getSortedMatches(operatorResult *operators.Result) []string {
sortedMatches := make([]string, 0, len(operatorResult.Matches))
for _, matches := range operatorResult.Matches {
sortedMatches = append(sortedMatches, matches...)
}
sort.Slice(sortedMatches, func(i, j int) bool {
return len(sortedMatches[i]) > len(sortedMatches[j])
})
return sortedMatches
}
func CreateStatusCodeSnippet(response string, statusCode int) string {
if strings.HasPrefix(response, "HTTP/") {
strStatusCode := strconv.Itoa(statusCode)
@ -33,3 +58,7 @@ func CreateStatusCodeSnippet(response string, statusCode int) string {
}
return ""
}
func addColor(value string) string {
return colorFunction(value).String()
}

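`Highlight` above now flattens all matcher results, sorts them longest first, and colors each match character by character. Per-character coloring keeps the highlight intact when a match wraps across hexdump rows, and the longest-first ordering prevents a shorter match (for example `ValueToMatch`) from injecting escape codes inside a longer one (`ValueToMatch-1.2.3`) before the longer one is processed. A minimal standalone sketch of that ordering and coloring, with aurora replaced by a plain ANSI helper:

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// green wraps a value in an ANSI green escape sequence, similar to aurora.Green.
func green(s string) string { return "\x1b[32m" + s + "\x1b[0m" }

// highlightAll colors every occurrence of each match, longest match first and
// character by character, so shorter matches cannot break up a longer,
// already-colored match and wrapped output stays highlighted.
func highlightAll(response string, matches []string) string {
	sort.Slice(matches, func(i, j int) bool { return len(matches[i]) > len(matches[j]) })
	for _, match := range matches {
		var colored strings.Builder
		for _, char := range match {
			colored.WriteString(green(string(char)))
		}
		response = strings.ReplaceAll(response, match, colored.String())
	}
	return response
}

func main() {
	input := "start ValueToMatch end\nstart ValueToMatch-1.2.3 end\n"
	fmt.Print(highlightAll(input, []string{"ValueToMatch", "ValueToMatch-1.2.3"}))
}
```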
View File

@ -0,0 +1,111 @@
package responsehighlighter
import (
"encoding/hex"
"testing"
"github.com/stretchr/testify/assert"
"github.com/projectdiscovery/nuclei/v2/pkg/operators"
)
const input = "abcdefghijklmnabcdefghijklmnabcdefghijklmnabcdefghijklmnabcdefghijklmnabcdefghijklmnabcdefghijklmnabcdefghijklmnabcdefghijklmn"
func TestHexDumpHighlighting(t *testing.T) {
highlightedHexDumpResponse :=
"00000000 61 62 63 \x1b[32m64\x1b[0m \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m 6b 6c 6d 6e 61 62 |abc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmnab|\n" +
"00000010 63 \x1b[32m64\x1b[0m \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m 6b 6c 6d 6e 61 62 63 \x1b[32m64\x1b[0m |c\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmnabc\x1b[32md\x1b[0m|\n" +
"00000020 \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m 6b 6c 6d 6e 61 62 63 \x1b[32m64\x1b[0m \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m |\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmnabc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m|\n" +
"00000030 \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m 6b 6c 6d 6e 61 62 63 \x1b[32m64\x1b[0m \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m |\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmnabc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m|\n" +
"00000040 \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m 6b 6c 6d 6e 61 62 63 \x1b[32m64\x1b[0m \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m |\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmnabc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0m|\n" +
"00000050 6b 6c 6d 6e 61 62 63 \x1b[32m64\x1b[0m \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m 6b 6c |klmnabc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mkl|\n" +
"00000060 6d 6e 61 62 63 \x1b[32m64\x1b[0m \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m 6b 6c 6d 6e |mnabc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn|\n" +
"00000070 61 62 63 \x1b[32m64\x1b[0m \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m 6b 6c 6d 6e |abc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn|\n"
t.Run("Test highlighting when the snippet is wrapped", func(t *testing.T) {
result, err := toHighLightedHexDump(hex.Dump([]byte(input)), "defghij")
assert.Nil(t, err)
assert.Equal(t, highlightedHexDumpResponse, result.String())
})
t.Run("Test highlight when the snippet contains separator character", func(t *testing.T) {
value := "asdfasdfasda|basdfadsdfs|"
result, err := toHighLightedHexDump(hex.Dump([]byte(value)), "a|b")
expected :=
"00000000 61 73 64 66 61 73 64 66 61 73 64 \x1b[32m61\x1b[0m \x1b[32m7c\x1b[0m \x1b[32m62\x1b[0m 61 73 |asdfasdfasd\x1b[32ma\x1b[0m\x1b[32m|\x1b[0m\x1b[32mb\x1b[0mas|\n" +
"00000010 64 66 61 64 73 64 66 73 7c |dfadsdfs||\n"
assert.Nil(t, err)
assert.Equal(t, expected, result.String())
})
}
func TestHighlight(t *testing.T) {
const multiSnippetHighlightHexDumpResponse = "00000000 \x1b[32m61\x1b[0m \x1b[32m62\x1b[0m 63 \x1b[32m64\x1b[0m \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m 6b 6c 6d 6e \x1b[32m61\x1b[0m \x1b[32m62\x1b[0m |\x1b[32ma\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn\x1b[32ma\x1b[0m\x1b[32mb\x1b[0m|\n" +
"00000010 63 \x1b[32m64\x1b[0m \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m 6b 6c 6d 6e \x1b[32m61\x1b[0m \x1b[32m62\x1b[0m 63 \x1b[32m64\x1b[0m |c\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn\x1b[32ma\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m|\n" +
"00000020 \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m 6b 6c 6d 6e \x1b[32m61\x1b[0m \x1b[32m62\x1b[0m 63 \x1b[32m64\x1b[0m \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m |\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn\x1b[32ma\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m|\n" +
"00000030 \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m 6b 6c 6d 6e \x1b[32m61\x1b[0m \x1b[32m62\x1b[0m 63 \x1b[32m64\x1b[0m \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m |\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn\x1b[32ma\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m|\n" +
"00000040 \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m 6b 6c 6d 6e \x1b[32m61\x1b[0m \x1b[32m62\x1b[0m 63 \x1b[32m64\x1b[0m \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m |\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn\x1b[32ma\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0m|\n" +
"00000050 6b 6c 6d 6e \x1b[32m61\x1b[0m \x1b[32m62\x1b[0m 63 \x1b[32m64\x1b[0m \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m 6b 6c |klmn\x1b[32ma\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mkl|\n" +
"00000060 6d 6e \x1b[32m61\x1b[0m \x1b[32m62\x1b[0m 63 \x1b[32m64\x1b[0m \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m 6b 6c 6d 6e |mn\x1b[32ma\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn|\n" +
"00000070 \x1b[32m61\x1b[0m \x1b[32m62\x1b[0m 63 \x1b[32m64\x1b[0m \x1b[32m65\x1b[0m \x1b[32m66\x1b[0m \x1b[32m67\x1b[0m \x1b[32m68\x1b[0m \x1b[32m69\x1b[0m \x1b[32m6a\x1b[0m 6b 6c 6d 6e |\x1b[32ma\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn|\n"
matches := map[string][]string{
"first": {"defghij"},
"second": {"ab"},
}
operatorResult := operators.Result{Matches: matches}
t.Run("Test highlighting when the snippet is wrapped", func(t *testing.T) {
result := Highlight(&operatorResult, hex.Dump([]byte(input)), false, true)
assert.Equal(t, multiSnippetHighlightHexDumpResponse, result)
})
t.Run("Test highlighting without hexdump", func(t *testing.T) {
result := Highlight(&operatorResult, input, false, false)
expected :=
"\x1b[32ma\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn\x1b[32m" +
"a\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn\x1b[32m" +
"a\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn\x1b[32m" +
"a\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn\x1b[32m" +
"a\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn\x1b[32m" +
"a\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn\x1b[32m" +
"a\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn\x1b[32m" +
"a\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn\x1b[32m" +
"a\x1b[0m\x1b[32mb\x1b[0mc\x1b[32md\x1b[0m\x1b[32me\x1b[0m\x1b[32mf\x1b[0m\x1b[32mg\x1b[0m\x1b[32mh\x1b[0m\x1b[32mi\x1b[0m\x1b[32mj\x1b[0mklmn"
print(result)
assert.Equal(t, expected, result)
})
t.Run("Test the response is not modified if noColor is true", func(t *testing.T) {
result := Highlight(&operatorResult, input, true, false)
assert.Equal(t, input, result)
})
t.Run("Test the response is not modified if noColor is true", func(t *testing.T) {
result := Highlight(&operatorResult, hex.Dump([]byte(input)), true, true)
assert.Equal(t, hex.Dump([]byte(input)), result)
})
}
func TestMultiSubstringMatchHighlight(t *testing.T) {
const input = `
start ValueToMatch end
start ValueToMatch-1.2.3 end
start ValueToMatch-2.1 end
`
matches := map[string][]string{
"first": {"ValueToMatch"},
"second": {"ValueToMatch-1.2.3"},
"third": {"ValueToMatch-2.1"},
}
operatorResult := operators.Result{Matches: matches}
expected :=
"\nstart \x1b[32mV\x1b[0m\x1b[32ma\x1b[0m\x1b[32ml\x1b[0m\x1b[32mu\x1b[0m\x1b[32me\x1b[0m\x1b[32mT\x1b[0m\x1b[32mo\x1b[0m\x1b[32mM\x1b[0m\x1b[32ma\x1b[0m\x1b[32mt\x1b[0m\x1b[32mc\x1b[0m\x1b[32mh\x1b[0m end\n" +
"start \x1b[32mV\x1b[0m\x1b[32ma\x1b[0m\x1b[32ml\x1b[0m\x1b[32mu\x1b[0m\x1b[32me\x1b[0m\x1b[32mT\x1b[0m\x1b[32mo\x1b[0m\x1b[32mM\x1b[0m\x1b[32ma\x1b[0m\x1b[32mt\x1b[0m\x1b[32mc\x1b[0m\x1b[32mh\x1b[0m\x1b[32m-\x1b[0m\x1b[32m1\x1b[0m\x1b[32m.\x1b[0m\x1b[32m2\x1b[0m\x1b[32m.\x1b[0m\x1b[32m3\x1b[0m end\n" +
"start \x1b[32mV\x1b[0m\x1b[32ma\x1b[0m\x1b[32ml\x1b[0m\x1b[32mu\x1b[0m\x1b[32me\x1b[0m\x1b[32mT\x1b[0m\x1b[32mo\x1b[0m\x1b[32mM\x1b[0m\x1b[32ma\x1b[0m\x1b[32mt\x1b[0m\x1b[32mc\x1b[0m\x1b[32mh\x1b[0m\x1b[32m-\x1b[0m\x1b[32m2\x1b[0m\x1b[32m.\x1b[0m\x1b[32m1\x1b[0m end \n"
result := Highlight(&operatorResult, input, false, false)
assert.Equal(t, expected, result)
}

View File

@ -0,0 +1,35 @@
package writer
import (
"github.com/projectdiscovery/gologger"
"github.com/projectdiscovery/nuclei/v2/pkg/output"
"github.com/projectdiscovery/nuclei/v2/pkg/progress"
"github.com/projectdiscovery/nuclei/v2/pkg/reporting"
)
// WriteResult is a helper for writing results to the output
func WriteResult(data *output.InternalWrappedEvent, output output.Writer, progress progress.Progress, issuesClient *reporting.Client) bool {
// Handle the case where no result found for the template.
// In this case, we just show misc information about the failed
// match for the template.
if data.OperatorsResult == nil {
return false
}
var matched bool
for _, result := range data.Results {
if err := output.Write(result); err != nil {
gologger.Warning().Msgf("Could not write output event: %s\n", err)
}
if !matched {
matched = true
}
progress.IncrementMatched()
if issuesClient != nil {
if err := issuesClient.CreateIssue(result); err != nil {
gologger.Warning().Msgf("Could not create issue on tracker: %s", err)
}
}
}
return matched
}

View File

@ -19,6 +19,7 @@ import (
"github.com/projectdiscovery/nuclei/v2/pkg/operators"
"github.com/projectdiscovery/nuclei/v2/pkg/output"
"github.com/projectdiscovery/nuclei/v2/pkg/progress"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/helpers/writer"
"github.com/projectdiscovery/nuclei/v2/pkg/reporting"
)
@ -72,6 +73,8 @@ type Options struct {
Progress progress.Progress
// Debug specifies whether debugging output should be shown for interactsh-client
Debug bool
NoInteractsh bool
}
const defaultMaxInteractionsCount = 5000
@ -103,7 +106,24 @@ func New(options *Options) (*Client, error) {
return interactClient, nil
}
// NewDefaultOptions returns the default options for interactsh client
func NewDefaultOptions(output output.Writer, reporting *reporting.Client, progress progress.Progress) *Options {
return &Options{
ServerURL: "https://interactsh.com",
CacheSize: 5000,
Eviction: 60 * time.Second,
ColldownPeriod: 5 * time.Second,
PollDuration: 5 * time.Second,
Output: output,
IssuesClient: reporting,
Progress: progress,
}
}
func (c *Client) firstTimeInitializeClient() error {
if c.options.NoInteractsh {
return nil // do not init if disabled
}
interactsh, err := client.New(&client.Options{
ServerURL: c.options.ServerURL,
Token: c.options.Authorization,
@ -158,20 +178,9 @@ func (c *Client) processInteractionForRequest(interaction *server.Interaction, d
}
data.Event.Results = data.MakeResultFunc(data.Event)
for _, result := range data.Event.Results {
result.Interaction = interaction
_ = c.options.Output.Write(result)
if !c.matched {
if writer.WriteResult(data.Event, c.options.Output, c.options.Progress, c.options.IssuesClient) {
c.matched = true
}
c.options.Progress.IncrementMatched()
if c.options.IssuesClient != nil {
if err := c.options.IssuesClient.CreateIssue(result); err != nil {
gologger.Warning().Msgf("Could not create issue on tracker: %s", err)
}
}
}
return true
}
@ -206,12 +215,13 @@ func (c *Client) Close() bool {
//
// It accepts data to replace as well as the URL to replace placeholders
// with generated uniquely for each request.
func (c *Client) ReplaceMarkers(data, interactshURL string) string {
if !strings.Contains(data, interactshURLMarker) {
return data
func (c *Client) ReplaceMarkers(data string, interactshURLs []string) (string, []string) {
for strings.Contains(data, interactshURLMarker) {
url := c.URL()
interactshURLs = append(interactshURLs, url)
data = strings.Replace(data, interactshURLMarker, url, 1)
}
replaced := strings.NewReplacer("{{interactsh-url}}", interactshURL).Replace(data)
return replaced
return data, interactshURLs
}
// MakeResultEventFunc is a result making function for nuclei
@ -227,7 +237,8 @@ type RequestData struct {
}
// RequestEvent is the event for a network request sent by nuclei.
func (c *Client) RequestEvent(interactshURL string, data *RequestData) {
func (c *Client) RequestEvent(interactshURLs []string, data *RequestData) {
for _, interactshURL := range interactshURLs {
id := strings.TrimSuffix(interactshURL, c.dotHostname)
interaction := c.interactions.Get(id)
@ -238,19 +249,17 @@ func (c *Client) RequestEvent(interactshURL string, data *RequestData) {
c.requests.Set(id, data, c.eviction)
return
}
matched := false
for _, interaction := range interactions {
if c.processInteractionForRequest(interaction, data) {
matched = true
c.interactions.Delete(id)
break
}
}
if matched {
c.interactions.Delete(id)
}
} else {
c.requests.Set(id, data, c.eviction)
}
}
}
// HasMatchers returns true if an operator has interactsh part

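`ReplaceMarkers` above now mints a fresh interactsh URL for every `{{interactsh-url}}` occurrence and returns the full list, so `RequestEvent` can later look up interactions for each generated subdomain. A minimal standalone sketch of that loop with a stand-in URL generator (the generator function and the host it returns are made up for illustration):

```go
package main

import (
	"fmt"
	"strings"
)

const interactshURLMarker = "{{interactsh-url}}"

// replaceMarkers swaps every marker occurrence for a freshly generated URL and
// returns the rewritten data together with every URL that was handed out.
func replaceMarkers(data string, newURL func() string) (string, []string) {
	var urls []string
	for strings.Contains(data, interactshURLMarker) {
		url := newURL()
		urls = append(urls, url)
		data = strings.Replace(data, interactshURLMarker, url, 1)
	}
	return data, urls
}

func main() {
	counter := 0
	// stand-in for Client.URL(); the hostname used here is hypothetical
	newURL := func() string {
		counter++
		return fmt.Sprintf("c%d.oast.example.com", counter)
	}

	body := "GET /?a={{interactsh-url}}&b={{interactsh-url}} HTTP/1.1"
	rewritten, urls := replaceMarkers(body, newURL)
	fmt.Println(rewritten)
	fmt.Println(urls)
}
```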
View File

@ -18,6 +18,7 @@ func Init(options *types.Options) error {
if options.ResolversFile != "" {
opts.BaseResolvers = options.InternalResolversList
}
opts.WithDialerHistory = true
dialer, err := fastdialer.NewDialer(opts)
if err != nil {
return errors.Wrap(err, "could not create dialer")

View File

@ -7,8 +7,11 @@ import (
"github.com/miekg/dns"
"github.com/pkg/errors"
"github.com/weppos/publicsuffix-go/publicsuffix"
"github.com/projectdiscovery/nuclei/v2/pkg/operators"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/expressions"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/replacer"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/dns/dnsclientpool"
"github.com/projectdiscovery/retryabledns"
@ -30,7 +33,7 @@ type Request struct {
// - value: "\"{{FQDN}}\""
Name string `yaml:"name,omitempty" jsonschema:"title=hostname to make dns request for,description=Name is the Hostname to make DNS request for"`
// description: |
// Type is the type of DNS request to make.
// RequestType is the type of DNS request to make.
// values:
// - "A"
// - "NS"
@ -41,7 +44,7 @@ type Request struct {
// - "MX"
// - "TXT"
// - "AAAA"
Type string `yaml:"type,omitempty" jsonschema:"title=type of dns request to make,description=Type is the type of DNS request to make,enum=A,enum=NS,enum=DS,enum=CNAME,enum=SOA,enum=PTR,enum=MX,enum=TXT,enum=AAAA"`
RequestType DNSRequestTypeHolder `yaml:"type,omitempty" jsonschema:"title=type of dns request to make,description=Type is the type of DNS request to make,enum=A,enum=NS,enum=DS,enum=CNAME,enum=SOA,enum=PTR,enum=MX,enum=TXT,enum=AAAA"`
// description: |
// Class is the class of the DNS request.
//
@ -60,6 +63,15 @@ type Request struct {
// - name: Use a retry of 3 to 5 generally
// value: 5
Retries int `yaml:"retries,omitempty" jsonschema:"title=retries for dns request,description=Retries is the number of retries for the DNS request"`
// description: |
// Trace performs a trace operation for the target.
Trace bool `yaml:"trace,omitempty" jsonschema:"title=trace operation,description=Trace performs a trace operation for the target."`
// description: |
// TraceMaxRecursion is the number of max recursion allowed for trace operations
// examples:
// - name: Use a max recursion of 100 to 150 generally
// value: 100
TraceMaxRecursion int `yaml:"trace-max-recursion,omitempty" jsonschema:"title=trace-max-recursion level for dns request,description=TraceMaxRecursion is the number of max recursion allowed for trace operations"`
CompiledOperators *operators.Operators `yaml:"-"`
dnsClient *retryabledns.Client
@ -94,7 +106,7 @@ func (request *Request) Compile(options *protocols.ExecuterOptions) error {
dnsClientOptions.Resolvers = request.Resolvers
}
// Create a dns client for the class
client, err := dnsclientpool.Get(options.Options, dnsClientOptions)
client, err := request.getDnsClient(options, nil)
if err != nil {
return errors.Wrap(err, "could not get dns client")
}
@ -109,10 +121,32 @@ func (request *Request) Compile(options *protocols.ExecuterOptions) error {
}
request.class = classToInt(request.Class)
request.options = options
request.question = questionTypeToInt(request.Type)
request.question = questionTypeToInt(request.RequestType.String())
return nil
}
func (request *Request) getDnsClient(options *protocols.ExecuterOptions, metadata map[string]interface{}) (*retryabledns.Client, error) {
dnsClientOptions := &dnsclientpool.Configuration{
Retries: request.Retries,
}
if len(request.Resolvers) > 0 {
if len(request.Resolvers) > 0 {
for _, resolver := range request.Resolvers {
if expressions.ContainsUnresolvedVariables(resolver) != nil {
var err error
resolver, err = expressions.Evaluate(resolver, metadata)
if err != nil {
return nil, errors.Wrap(err, "could not resolve resolvers expressions")
}
dnsClientOptions.Resolvers = append(dnsClientOptions.Resolvers, resolver)
}
}
}
dnsClientOptions.Resolvers = request.Resolvers
}
return dnsclientpool.Get(options.Options, dnsClientOptions)
}
// Requests returns the total number of requests the YAML rule will perform
func (request *Request) Requests() int {
return 1
@ -132,7 +166,7 @@ func (request *Request) Make(domain string) (*dns.Msg, error) {
var q dns.Question
final := replacer.Replace(request.Name, map[string]interface{}{"FQDN": domain})
final := replacer.Replace(request.Name, generateDNSVariables(domain))
q.Name = dns.Fqdn(final)
q.Qclass = request.class
@ -198,3 +232,19 @@ func classToInt(class string) uint16 {
}
return uint16(result)
}
func generateDNSVariables(domain string) map[string]interface{} {
parsed, err := publicsuffix.Parse(strings.TrimSuffix(domain, "."))
if err != nil {
return map[string]interface{}{"FQDN": domain}
}
domainName := strings.Join([]string{parsed.SLD, parsed.TLD}, ".")
return map[string]interface{}{
"FQDN": domain,
"RDN": domainName,
"DN": parsed.SLD,
"TLD": parsed.TLD,
"SD": parsed.TRD,
}
}

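`generateDNSVariables` above derives the FQDN, RDN, DN, TLD and SD template variables from the target via public-suffix parsing; for `www.projectdiscovery.io` that yields `projectdiscovery.io`, `projectdiscovery`, `io` and `www`, as the test in the next file asserts. The sketch below reproduces that derivation with the standard library only and a naive last-label TLD split, so multi-label public suffixes such as `co.uk` are not handled here (that is what the publicsuffix package is for):

```go
package main

import (
	"fmt"
	"strings"
)

// dnsVariables derives template variables from a fully qualified domain name.
// The last label is naively treated as the TLD, so multi-label public suffixes
// such as co.uk are not handled in this sketch.
func dnsVariables(domain string) map[string]interface{} {
	labels := strings.Split(strings.TrimSuffix(domain, "."), ".")
	if len(labels) < 3 {
		return map[string]interface{}{"FQDN": domain}
	}
	tld := labels[len(labels)-1]
	dn := labels[len(labels)-2]
	return map[string]interface{}{
		"FQDN": domain,
		"RDN":  dn + "." + tld,
		"DN":   dn,
		"TLD":  tld,
		"SD":   strings.Join(labels[:len(labels)-2], "."),
	}
}

func main() {
	fmt.Println(dnsVariables("www.projectdiscovery.io"))
	// map[DN:projectdiscovery FQDN:www.projectdiscovery.io RDN:projectdiscovery.io SD:www TLD:io]
}
```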
View File

@ -5,18 +5,29 @@ import (
"github.com/stretchr/testify/require"
"github.com/projectdiscovery/nuclei/v2/internal/testutils"
"github.com/projectdiscovery/nuclei/v2/pkg/model"
"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)
func TestGenerateDNSVariables(t *testing.T) {
vars := generateDNSVariables("www.projectdiscovery.io")
require.Equal(t, map[string]interface{}{
"FQDN": "www.projectdiscovery.io",
"RDN": "projectdiscovery.io",
"DN": "projectdiscovery",
"TLD": "io",
"SD": "www",
}, vars, "could not get dns variables")
}
func TestDNSCompileMake(t *testing.T) {
options := testutils.DefaultOptions
testutils.Init(options)
const templateID = "testing-dns"
request := &Request{
Type: "A",
RequestType: DNSRequestTypeHolder{DNSRequestType: A},
Class: "INET",
Retries: 5,
ID: templateID,

View File

@ -0,0 +1,110 @@
package dns
import (
"encoding/json"
"errors"
"strings"
"github.com/alecthomas/jsonschema"
)
// DNSRequestType is the type of the DNS request specified
type DNSRequestType int
const (
A DNSRequestType = iota + 1
NS
DS
CNAME
SOA
PTR
MX
TXT
AAAA
//limit
limit
)
// DNSRequestTypeMapping is a table for conversion of request type from string.
var DNSRequestTypeMapping = map[DNSRequestType]string{
A: "A",
NS: "NS",
DS: "DS",
CNAME: "CNAME",
SOA: "SOA",
PTR: "PTR",
MX: "MX",
TXT: "TXT",
AAAA: "AAAA",
}
// GetSupportedDNSRequestTypes returns list of supported types
func GetSupportedDNSRequestTypes() []DNSRequestType {
var result []DNSRequestType
for index := DNSRequestType(1); index < limit; index++ {
result = append(result, index)
}
return result
}
func toDNSRequestTypes(valueToMap string) (DNSRequestType, error) {
normalizedValue := normalizeValue(valueToMap)
for key, currentValue := range DNSRequestTypeMapping {
if normalizedValue == currentValue {
return key, nil
}
}
return -1, errors.New("Invalid DNS request type: " + valueToMap)
}
func normalizeValue(value string) string {
return strings.TrimSpace(strings.ToUpper(value))
}
func (t DNSRequestType) String() string {
return DNSRequestTypeMapping[t]
}
// DNSRequestTypeHolder is used to hold internal type of the DNS type
type DNSRequestTypeHolder struct {
DNSRequestType DNSRequestType
}
func (holder DNSRequestTypeHolder) String() string {
return holder.DNSRequestType.String()
}
func (holder DNSRequestTypeHolder) JSONSchemaType() *jsonschema.Type {
gotType := &jsonschema.Type{
Type: "string",
Title: "type of DNS request to make",
Description: "Type is the type of DNS request to make,enum=A,enum=NS,enum=DS,enum=CNAME,enum=SOA,enum=PTR,enum=MX,enum=TXT,enum=AAAA",
}
for _, types := range GetSupportedDNSRequestTypes() {
gotType.Enum = append(gotType.Enum, types.String())
}
return gotType
}
func (holder *DNSRequestTypeHolder) UnmarshalYAML(unmarshal func(interface{}) error) error {
var marshalledTypes string
if err := unmarshal(&marshalledTypes); err != nil {
return err
}
computedType, err := toDNSRequestTypes(marshalledTypes)
if err != nil {
return err
}
holder.DNSRequestType = computedType
return nil
}
func (holder *DNSRequestTypeHolder) MarshalJSON() ([]byte, error) {
return json.Marshal(holder.DNSRequestType.String())
}
func (holder DNSRequestTypeHolder) MarshalYAML() (interface{}, error) {
return holder.DNSRequestType.String(), nil
}

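The `DNSRequestTypeHolder` above converts the template's `type: A` string into a typed constant while the YAML is being parsed, so unknown request types fail at template load time with a clear error rather than at request time. A minimal standalone sketch of the same holder pattern, assuming `gopkg.in/yaml.v2`; the enum, field and type names here are illustrative, not the nuclei ones:

```go
package main

import (
	"fmt"
	"strings"

	"gopkg.in/yaml.v2"
)

type requestType int

const (
	typeA requestType = iota + 1
	typeNS
	typeTXT
)

var requestTypeNames = map[requestType]string{typeA: "A", typeNS: "NS", typeTXT: "TXT"}

func (t requestType) String() string { return requestTypeNames[t] }

// typeHolder converts the YAML string into the typed constant while the
// document is being parsed, so unknown values fail with a clear error up front.
type typeHolder struct {
	requestType requestType
}

func (holder *typeHolder) UnmarshalYAML(unmarshal func(interface{}) error) error {
	var raw string
	if err := unmarshal(&raw); err != nil {
		return err
	}
	normalized := strings.TrimSpace(strings.ToUpper(raw))
	for value, name := range requestTypeNames {
		if name == normalized {
			holder.requestType = value
			return nil
		}
	}
	return fmt.Errorf("invalid DNS request type: %s", raw)
}

type request struct {
	Type typeHolder `yaml:"type"`
}

func main() {
	var parsed request
	if err := yaml.Unmarshal([]byte("type: a\n"), &parsed); err != nil {
		panic(err)
	}
	fmt.Println(parsed.Type.requestType) // A
}
```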
View File

@ -2,6 +2,8 @@ package dns
import (
"bytes"
"fmt"
"strings"
"time"
"github.com/miekg/dns"
@ -12,17 +14,12 @@ import (
"github.com/projectdiscovery/nuclei/v2/pkg/output"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols"
"github.com/projectdiscovery/nuclei/v2/pkg/types"
"github.com/projectdiscovery/retryabledns"
)
// Match matches a generic data response again a given matcher
// Match matches a generic data response against a given matcher
func (request *Request) Match(data map[string]interface{}, matcher *matchers.Matcher) (bool, []string) {
partString := matcher.Part
switch partString {
case "body", "all", "":
partString = "raw"
}
item, ok := data[partString]
item, ok := request.getMatchPart(matcher.Part, data)
if !ok {
return false, []string{}
}
@ -50,29 +47,36 @@ func (request *Request) Match(data map[string]interface{}, matcher *matchers.Mat
// Extract performs extracting operation for an extractor on model and returns true or false.
func (request *Request) Extract(data map[string]interface{}, extractor *extractors.Extractor) map[string]struct{} {
part := extractor.Part
switch part {
case "body", "all":
part = "raw"
}
item, ok := data[part]
item, ok := request.getMatchPart(extractor.Part, data)
if !ok {
return nil
}
itemStr := types.ToString(item)
switch extractor.GetType() {
case extractors.RegexExtractor:
return extractor.ExtractRegex(itemStr)
return extractor.ExtractRegex(types.ToString(item))
case extractors.KValExtractor:
return extractor.ExtractKval(data)
}
return nil
}
func (request *Request) getMatchPart(part string, data output.InternalEvent) (interface{}, bool) {
switch part {
case "body", "all", "":
part = "raw"
}
item, ok := data[part]
if !ok {
return "", false
}
return item, true
}
// responseToDSLMap converts a DNS response to a map for use in DSL matching
func (request *Request) responseToDSLMap(req, resp *dns.Msg, host, matched string) output.InternalEvent {
func (request *Request) responseToDSLMap(req, resp *dns.Msg, host, matched string, tracedata *retryabledns.TraceData) output.InternalEvent {
return output.InternalEvent{
"host": host,
"matched": matched,
@ -86,6 +90,8 @@ func (request *Request) responseToDSLMap(req, resp *dns.Msg, host, matched strin
"template-id": request.options.TemplateID,
"template-info": request.options.TemplateInfo,
"template-path": request.options.TemplatePath,
"type": request.Type().String(),
"trace": traceToString(tracedata, false),
}
}
@ -99,10 +105,11 @@ func (request *Request) MakeResultEventItem(wrapped *output.InternalWrappedEvent
TemplateID: types.ToString(wrapped.InternalEvent["template-id"]),
TemplatePath: types.ToString(wrapped.InternalEvent["template-path"]),
Info: wrapped.InternalEvent["template-info"].(model.Info),
Type: "dns",
Type: types.ToString(wrapped.InternalEvent["type"]),
Host: types.ToString(wrapped.InternalEvent["host"]),
Matched: types.ToString(wrapped.InternalEvent["matched"]),
ExtractedResults: wrapped.OperatorsResult.OutputExtracts,
MatcherStatus: true,
Timestamp: time.Now(),
Request: types.ToString(wrapped.InternalEvent["request"]),
Response: types.ToString(wrapped.InternalEvent["raw"]),
@ -125,3 +132,16 @@ func questionToString(resourceRecords []dns.Question) string {
}
return buffer.String()
}
func traceToString(tracedata *retryabledns.TraceData, withSteps bool) string {
buffer := &bytes.Buffer{}
if tracedata != nil {
for i, dnsRecord := range tracedata.DNSData {
if withSteps {
buffer.WriteString(fmt.Sprintf("request %d to resolver %s:\n", i, strings.Join(dnsRecord.Resolver, ",")))
}
buffer.WriteString(dnsRecord.Raw)
}
}
return buffer.String()
}

View File

@ -8,13 +8,13 @@ import (
"github.com/miekg/dns"
"github.com/stretchr/testify/require"
"github.com/projectdiscovery/nuclei/v2/internal/testutils"
"github.com/projectdiscovery/nuclei/v2/pkg/model"
"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
"github.com/projectdiscovery/nuclei/v2/pkg/operators"
"github.com/projectdiscovery/nuclei/v2/pkg/operators/extractors"
"github.com/projectdiscovery/nuclei/v2/pkg/operators/matchers"
"github.com/projectdiscovery/nuclei/v2/pkg/output"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)
func TestResponseToDSLMap(t *testing.T) {
@ -23,7 +23,7 @@ func TestResponseToDSLMap(t *testing.T) {
testutils.Init(options)
templateID := "testing-dns"
request := &Request{
Type: "A",
RequestType: DNSRequestTypeHolder{DNSRequestType: A},
Class: "INET",
Retries: 5,
ID: templateID,
@ -44,8 +44,8 @@ func TestResponseToDSLMap(t *testing.T) {
resp.Rcode = dns.RcodeSuccess
resp.Answer = append(resp.Answer, &dns.A{A: net.ParseIP("1.1.1.1"), Hdr: dns.RR_Header{Name: "one.one.one.one."}})
event := request.responseToDSLMap(req, resp, "one.one.one.one", "one.one.one.one")
require.Len(t, event, 12, "could not get correct number of items in dsl map")
event := request.responseToDSLMap(req, resp, "one.one.one.one", "one.one.one.one", nil)
require.Len(t, event, 14, "could not get correct number of items in dsl map")
require.Equal(t, dns.RcodeSuccess, event["rcode"], "could not get correct rcode")
}
@ -55,7 +55,7 @@ func TestDNSOperatorMatch(t *testing.T) {
testutils.Init(options)
templateID := "testing-dns"
request := &Request{
Type: "A",
RequestType: DNSRequestTypeHolder{DNSRequestType: A},
Class: "INET",
Retries: 5,
ID: templateID,
@ -76,12 +76,12 @@ func TestDNSOperatorMatch(t *testing.T) {
resp.Rcode = dns.RcodeSuccess
resp.Answer = append(resp.Answer, &dns.A{A: net.ParseIP("1.1.1.1"), Hdr: dns.RR_Header{Name: "one.one.one.one."}})
event := request.responseToDSLMap(req, resp, "one.one.one.one", "one.one.one.one")
event := request.responseToDSLMap(req, resp, "one.one.one.one", "one.one.one.one", nil)
t.Run("valid", func(t *testing.T) {
matcher := &matchers.Matcher{
Part: "raw",
Type: "word",
Type: matchers.MatcherTypeHolder{MatcherType: matchers.WordsMatcher},
Words: []string{"1.1.1.1"},
}
err = matcher.CompileMatchers()
@ -95,7 +95,7 @@ func TestDNSOperatorMatch(t *testing.T) {
t.Run("rcode", func(t *testing.T) {
matcher := &matchers.Matcher{
Part: "rcode",
Type: "status",
Type: matchers.MatcherTypeHolder{MatcherType: matchers.StatusMatcher},
Status: []int{dns.RcodeSuccess},
}
err = matcher.CompileMatchers()
@ -109,7 +109,7 @@ func TestDNSOperatorMatch(t *testing.T) {
t.Run("negative", func(t *testing.T) {
matcher := &matchers.Matcher{
Part: "raw",
Type: "word",
Type: matchers.MatcherTypeHolder{MatcherType: matchers.WordsMatcher},
Negative: true,
Words: []string{"random"},
}
@ -124,7 +124,7 @@ func TestDNSOperatorMatch(t *testing.T) {
t.Run("invalid", func(t *testing.T) {
matcher := &matchers.Matcher{
Part: "raw",
Type: "word",
Type: matchers.MatcherTypeHolder{MatcherType: matchers.WordsMatcher},
Words: []string{"random"},
}
err := matcher.CompileMatchers()
@ -134,6 +134,30 @@ func TestDNSOperatorMatch(t *testing.T) {
require.False(t, isMatched, "could match invalid response matcher")
require.Equal(t, []string{}, matched)
})
t.Run("caseInsensitive", func(t *testing.T) {
req := new(dns.Msg)
req.Question = append(req.Question, dns.Question{Name: "ONE.ONE.ONE.ONE.", Qtype: dns.TypeA, Qclass: dns.ClassINET})
resp := new(dns.Msg)
resp.Rcode = dns.RcodeSuccess
resp.Answer = append(resp.Answer, &dns.A{A: net.ParseIP("1.1.1.1"), Hdr: dns.RR_Header{Name: "ONE.ONE.ONE.ONE."}})
event := request.responseToDSLMap(req, resp, "ONE.ONE.ONE.ONE", "ONE.ONE.ONE.ONE", nil)
matcher := &matchers.Matcher{
Part: "raw",
Type: matchers.MatcherTypeHolder{MatcherType: matchers.WordsMatcher},
Words: []string{"one.ONE.one.ONE"},
CaseInsensitive: true,
}
err = matcher.CompileMatchers()
require.Nil(t, err, "could not compile matcher")
isMatch, matched := request.Match(event, matcher)
require.True(t, isMatch, "could not match valid response")
require.Equal(t, []string{"one.one.one.one"}, matched)
})
}
func TestDNSOperatorExtract(t *testing.T) {
@ -142,7 +166,7 @@ func TestDNSOperatorExtract(t *testing.T) {
testutils.Init(options)
templateID := "testing-dns"
request := &Request{
Type: "A",
RequestType: DNSRequestTypeHolder{DNSRequestType: A},
Class: "INET",
Retries: 5,
ID: templateID,
@ -163,12 +187,12 @@ func TestDNSOperatorExtract(t *testing.T) {
resp.Rcode = dns.RcodeSuccess
resp.Answer = append(resp.Answer, &dns.A{A: net.ParseIP("1.1.1.1"), Hdr: dns.RR_Header{Name: "one.one.one.one."}})
event := request.responseToDSLMap(req, resp, "one.one.one.one", "one.one.one.one")
event := request.responseToDSLMap(req, resp, "one.one.one.one", "one.one.one.one", nil)
t.Run("extract", func(t *testing.T) {
extractor := &extractors.Extractor{
Part: "raw",
Type: "regex",
Type: extractors.TypeHolder{ExtractorType: extractors.RegexExtractor},
Regex: []string{"[0-9]+\\.[0-9]+\\.[0-9]+\\.[0-9]+"},
}
err = extractor.CompileExtractors()
@ -181,7 +205,7 @@ func TestDNSOperatorExtract(t *testing.T) {
t.Run("kval", func(t *testing.T) {
extractor := &extractors.Extractor{
Type: "kval",
Type: extractors.TypeHolder{ExtractorType: extractors.KValExtractor},
KVal: []string{"rcode"},
}
err = extractor.CompileExtractors()
@ -199,7 +223,7 @@ func TestDNSMakeResult(t *testing.T) {
testutils.Init(options)
templateID := "testing-dns"
request := &Request{
Type: "A",
RequestType: DNSRequestTypeHolder{DNSRequestType: A},
Class: "INET",
Retries: 5,
ID: templateID,
@ -209,12 +233,12 @@ func TestDNSMakeResult(t *testing.T) {
Matchers: []*matchers.Matcher{{
Name: "test",
Part: "raw",
Type: "word",
Type: matchers.MatcherTypeHolder{MatcherType: matchers.WordsMatcher},
Words: []string{"1.1.1.1"},
}},
Extractors: []*extractors.Extractor{{
Part: "raw",
Type: "regex",
Type: extractors.TypeHolder{ExtractorType: extractors.RegexExtractor},
Regex: []string{"[0-9]+\\.[0-9]+\\.[0-9]+\\.[0-9]+"},
}},
},
@ -233,7 +257,7 @@ func TestDNSMakeResult(t *testing.T) {
resp.Rcode = dns.RcodeSuccess
resp.Answer = append(resp.Answer, &dns.A{A: net.ParseIP("1.1.1.1"), Hdr: dns.RR_Header{Name: "one.one.one.one."}})
event := request.responseToDSLMap(req, resp, "one.one.one.one", "one.one.one.one")
event := request.responseToDSLMap(req, resp, "one.one.one.one", "one.one.one.one", nil)
finalEvent := &output.InternalWrappedEvent{InternalEvent: event}
if request.CompiledOperators != nil {
result, ok := request.CompiledOperators.Execute(event, request.Match, request.Extract, false)

View File

@ -1,6 +1,7 @@
package dns
import (
"encoding/hex"
"net/url"
"github.com/pkg/errors"
@ -11,10 +12,17 @@ import (
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/expressions"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/helpers/eventcreator"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/helpers/responsehighlighter"
"github.com/projectdiscovery/retryabledns"
templateTypes "github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
)
var _ protocols.Request = &Request{}
// Type returns the type of the protocol request
func (request *Request) Type() templateTypes.ProtocolType {
return templateTypes.DNSProtocol
}
// ExecuteWithResults executes the protocol requests and returns results instead of writing them.
func (request *Request) ExecuteWithResults(input string, metadata /*TODO review unused parameter*/, previous output.InternalEvent, callback protocols.OutputEventCallback) error {
// Parse the URL and return domain if URL.
@ -28,11 +36,19 @@ func (request *Request) ExecuteWithResults(input string, metadata /*TODO review
// Compile each request for the template based on the URL
compiledRequest, err := request.Make(domain)
if err != nil {
request.options.Output.Request(request.options.TemplateID, domain, "dns", err)
request.options.Output.Request(request.options.TemplatePath, domain, request.Type().String(), err)
request.options.Progress.IncrementFailedRequestsBy(1)
return errors.Wrap(err, "could not build request")
}
dnsClient := request.dnsClient
if varErr := expressions.ContainsUnresolvedVariables(request.Resolvers...); varErr != nil {
if dnsClient, varErr = request.getDnsClient(request.options, metadata); varErr != nil {
gologger.Warning().Msgf("[%s] Could not make dns request for %s: %v\n", request.options.TemplateID, domain, varErr)
return nil
}
}
requestString := compiledRequest.String()
if varErr := expressions.ContainsUnresolvedVariables(requestString); varErr != nil {
gologger.Warning().Msgf("[%s] Could not make dns request for %s: %v\n", request.options.TemplateID, domain, varErr)
@ -44,35 +60,71 @@ func (request *Request) ExecuteWithResults(input string, metadata /*TODO review
}
// Send the request to the target servers
resp, err := request.dnsClient.Do(compiledRequest)
response, err := dnsClient.Do(compiledRequest)
if err != nil {
request.options.Output.Request(request.options.TemplateID, domain, "dns", err)
request.options.Output.Request(request.options.TemplatePath, domain, request.Type().String(), err)
request.options.Progress.IncrementFailedRequestsBy(1)
}
if resp == nil {
if response == nil {
return errors.Wrap(err, "could not send dns request")
}
request.options.Progress.IncrementRequests()
request.options.Output.Request(request.options.TemplateID, domain, "dns", err)
gologger.Verbose().Msgf("[%s] Sent DNS request to %s", request.options.TemplateID, domain)
request.options.Output.Request(request.options.TemplatePath, domain, request.Type().String(), err)
gologger.Verbose().Msgf("[%s] Sent DNS request to %s\n", request.options.TemplateID, domain)
outputEvent := request.responseToDSLMap(compiledRequest, resp, input, input)
// perform trace if necessary
var tracedata *retryabledns.TraceData
if request.Trace {
tracedata, err = request.dnsClient.Trace(domain, request.question, request.TraceMaxRecursion)
if err != nil {
request.options.Output.Request(request.options.TemplatePath, domain, "dns", err)
}
}
outputEvent := request.responseToDSLMap(compiledRequest, response, input, input, tracedata)
for k, v := range previous {
outputEvent[k] = v
}
event := eventcreator.CreateEvent(request, outputEvent, request.options.Options.Debug || request.options.Options.DebugResponse)
// TODO: dynamic values are not supported yet
if request.options.Options.Debug || request.options.Options.DebugResponse {
gologger.Debug().Msgf("[%s] Dumped DNS response for %s", request.options.TemplateID, domain)
gologger.Print().Msgf("%s", responsehighlighter.Highlight(event.OperatorsResult, resp.String(), request.options.Options.NoColor))
dumpResponse(event, request.options, response.String(), domain)
if request.Trace {
dumpTraceData(event, request.options, traceToString(tracedata, true), domain)
}
callback(event)
return nil
}
func dumpResponse(event *output.InternalWrappedEvent, requestOptions *protocols.ExecuterOptions, response, domain string) {
cliOptions := requestOptions.Options
if cliOptions.Debug || cliOptions.DebugResponse {
hexDump := false
if responsehighlighter.HasBinaryContent(response) {
hexDump = true
response = hex.Dump([]byte(response))
}
highlightedResponse := responsehighlighter.Highlight(event.OperatorsResult, response, cliOptions.NoColor, hexDump)
gologger.Debug().Msgf("[%s] Dumped DNS response for %s\n\n%s", requestOptions.TemplateID, domain, highlightedResponse)
}
}
func dumpTraceData(event *output.InternalWrappedEvent, requestOptions *protocols.ExecuterOptions, tracedata, domain string) {
cliOptions := requestOptions.Options
if cliOptions.Debug || cliOptions.DebugResponse {
hexDump := false
if responsehighlighter.HasBinaryContent(tracedata) {
hexDump = true
tracedata = hex.Dump([]byte(tracedata))
}
highlightedResponse := responsehighlighter.Highlight(event.OperatorsResult, tracedata, cliOptions.NoColor, hexDump)
gologger.Debug().Msgf("[%s] Dumped DNS Trace data for %s\n\n%s", requestOptions.TemplateID, domain, highlightedResponse)
}
}
// isURL tests a string to determine if it is a well-structured url or not.
func isURL(toTest string) bool {
if _, err := url.ParseRequestURI(toTest); err != nil {

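`dumpResponse` above switches the debug output to a hexdump whenever the response contains non-ASCII bytes, and tells the highlighter to operate in hexdump mode so coloring survives the column layout. A minimal standalone sketch of just that ASCII-or-hexdump decision, with highlighting left out:

```go
package main

import (
	"encoding/hex"
	"fmt"
	"unicode"
)

// isASCII reports whether every byte of the input falls within the ASCII range.
func isASCII(input string) bool {
	for i := 0; i < len(input); i++ {
		if input[i] > unicode.MaxASCII {
			return false
		}
	}
	return true
}

// dumpForDebug returns the response unchanged when it is plain ASCII and a
// hexdump otherwise, so binary responses stay readable in debug output.
func dumpForDebug(response string) string {
	if isASCII(response) {
		return response
	}
	return hex.Dump([]byte(response))
}

func main() {
	fmt.Print(dumpForDebug("plain text response\n"))
	fmt.Print(dumpForDebug("\x00\x10\x84\x80\x00\x01 binary dns bytes"))
}
```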
View File

@ -5,13 +5,13 @@ import (
"github.com/stretchr/testify/require"
"github.com/projectdiscovery/nuclei/v2/internal/testutils"
"github.com/projectdiscovery/nuclei/v2/pkg/model"
"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
"github.com/projectdiscovery/nuclei/v2/pkg/operators"
"github.com/projectdiscovery/nuclei/v2/pkg/operators/extractors"
"github.com/projectdiscovery/nuclei/v2/pkg/operators/matchers"
"github.com/projectdiscovery/nuclei/v2/pkg/output"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)
func TestDNSExecuteWithResults(t *testing.T) {
@ -20,7 +20,7 @@ func TestDNSExecuteWithResults(t *testing.T) {
testutils.Init(options)
templateID := "testing-dns"
request := &Request{
Type: "A",
RequestType: DNSRequestTypeHolder{DNSRequestType: A},
Class: "INET",
Retries: 5,
ID: templateID,
@ -30,12 +30,12 @@ func TestDNSExecuteWithResults(t *testing.T) {
Matchers: []*matchers.Matcher{{
Name: "test",
Part: "raw",
Type: "word",
Type: matchers.MatcherTypeHolder{MatcherType: matchers.WordsMatcher},
Words: []string{"93.184.216.34"},
}},
Extractors: []*extractors.Extractor{{
Part: "raw",
Type: "regex",
Type: extractors.TypeHolder{ExtractorType: extractors.RegexExtractor},
Regex: []string{"[0-9]+\\.[0-9]+\\.[0-9]+\\.[0-9]+"},
}},
},

View File

@ -5,9 +5,9 @@ import (
"github.com/stretchr/testify/require"
"github.com/projectdiscovery/nuclei/v2/internal/testutils"
"github.com/projectdiscovery/nuclei/v2/pkg/model"
"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)
func TestFileCompile(t *testing.T) {

View File

@ -8,9 +8,9 @@ import (
"github.com/stretchr/testify/require"
"github.com/projectdiscovery/nuclei/v2/internal/testutils"
"github.com/projectdiscovery/nuclei/v2/pkg/model"
"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)
func TestFindInputPaths(t *testing.T) {

View File

@ -16,17 +16,10 @@ import (
// Match matches a generic data response against a given matcher
func (request *Request) Match(data map[string]interface{}, matcher *matchers.Matcher) (bool, []string) {
partString := matcher.Part
switch partString {
case "body", "all", "data", "":
partString = "raw"
}
item, ok := data[partString]
itemStr, ok := request.getMatchPart(matcher.Part, data)
if !ok {
return false, []string{}
}
itemStr := types.ToString(item)
switch matcher.GetType() {
case matchers.SizeMatcher:
@ -45,17 +38,10 @@ func (request *Request) Match(data map[string]interface{}, matcher *matchers.Mat
// Extract performs extracting operation for an extractor on model and returns true or false.
func (request *Request) Extract(data map[string]interface{}, extractor *extractors.Extractor) map[string]struct{} {
partString := extractor.Part
switch partString {
case "body", "all", "data", "":
partString = "raw"
}
item, ok := data[partString]
itemStr, ok := request.getMatchPart(extractor.Part, data)
if !ok {
return nil
}
itemStr := types.ToString(item)
switch extractor.GetType() {
case extractors.RegexExtractor:
@ -66,12 +52,28 @@ func (request *Request) Extract(data map[string]interface{}, extractor *extracto
return nil
}
func (request *Request) getMatchPart(part string, data output.InternalEvent) (string, bool) {
switch part {
case "body", "all", "data", "":
part = "raw"
}
item, ok := data[part]
if !ok {
return "", false
}
itemStr := types.ToString(item)
return itemStr, true
}
// responseToDSLMap converts a file response to a map for use in DSL matching
func (request *Request) responseToDSLMap(raw, inputFilePath, matchedFileName string) output.InternalEvent {
return output.InternalEvent{
"path": inputFilePath,
"matched": matchedFileName,
"raw": raw,
"type": request.Type().String(),
"template-id": request.options.TemplateID,
"template-info": request.options.TemplateInfo,
"template-path": request.options.TemplatePath,
@ -119,10 +121,11 @@ func (request *Request) GetCompiledOperators() []*operators.Operators {
func (request *Request) MakeResultEventItem(wrapped *output.InternalWrappedEvent) *output.ResultEvent {
data := &output.ResultEvent{
MatcherStatus: true,
TemplateID: types.ToString(wrapped.InternalEvent["template-id"]),
TemplatePath: types.ToString(wrapped.InternalEvent["template-path"]),
Info: wrapped.InternalEvent["template-info"].(model.Info),
Type: "file",
Type: types.ToString(wrapped.InternalEvent["type"]),
Path: types.ToString(wrapped.InternalEvent["path"]),
Matched: types.ToString(wrapped.InternalEvent["matched"]),
Host: types.ToString(wrapped.InternalEvent["host"]),

View File

@ -5,13 +5,13 @@ import (
"github.com/stretchr/testify/require"
"github.com/projectdiscovery/nuclei/v2/internal/testutils"
"github.com/projectdiscovery/nuclei/v2/pkg/model"
"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
"github.com/projectdiscovery/nuclei/v2/pkg/operators"
"github.com/projectdiscovery/nuclei/v2/pkg/operators/extractors"
"github.com/projectdiscovery/nuclei/v2/pkg/operators/matchers"
"github.com/projectdiscovery/nuclei/v2/pkg/output"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)
func TestResponseToDSLMap(t *testing.T) {
@ -35,7 +35,7 @@ func TestResponseToDSLMap(t *testing.T) {
resp := "test-data\r\n"
event := request.responseToDSLMap(resp, "one.one.one.one", "one.one.one.one")
require.Len(t, event, 6, "could not get correct number of items in dsl map")
require.Len(t, event, 7, "could not get correct number of items in dsl map")
require.Equal(t, resp, event["raw"], "could not get correct resp")
}
@ -60,13 +60,13 @@ func TestFileOperatorMatch(t *testing.T) {
resp := "test-data\r\n1.1.1.1\r\n"
event := request.responseToDSLMap(resp, "one.one.one.one", "one.one.one.one")
require.Len(t, event, 6, "could not get correct number of items in dsl map")
require.Len(t, event, 7, "could not get correct number of items in dsl map")
require.Equal(t, resp, event["raw"], "could not get correct resp")
t.Run("valid", func(t *testing.T) {
matcher := &matchers.Matcher{
Part: "raw",
Type: "word",
Type: matchers.MatcherTypeHolder{MatcherType: matchers.WordsMatcher},
Words: []string{"1.1.1.1"},
}
err = matcher.CompileMatchers()
@ -80,7 +80,7 @@ func TestFileOperatorMatch(t *testing.T) {
t.Run("negative", func(t *testing.T) {
matcher := &matchers.Matcher{
Part: "raw",
Type: "word",
Type: matchers.MatcherTypeHolder{MatcherType: matchers.WordsMatcher},
Negative: true,
Words: []string{"random"},
}
@ -95,7 +95,7 @@ func TestFileOperatorMatch(t *testing.T) {
t.Run("invalid", func(t *testing.T) {
matcher := &matchers.Matcher{
Part: "raw",
Type: "word",
Type: matchers.MatcherTypeHolder{MatcherType: matchers.WordsMatcher},
Words: []string{"random"},
}
err := matcher.CompileMatchers()
@ -105,6 +105,26 @@ func TestFileOperatorMatch(t *testing.T) {
require.False(t, isMatched, "could match invalid response matcher")
require.Equal(t, []string{}, matched)
})
t.Run("caseInsensitive", func(t *testing.T) {
resp := "TEST-DATA\r\n1.1.1.1\r\n"
event := request.responseToDSLMap(resp, "one.one.one.one", "one.one.one.one")
require.Len(t, event, 7, "could not get correct number of items in dsl map")
require.Equal(t, resp, event["raw"], "could not get correct resp")
matcher := &matchers.Matcher{
Part: "raw",
Type: matchers.MatcherTypeHolder{MatcherType: matchers.WordsMatcher},
Words: []string{"TeSt-DaTA"},
CaseInsensitive: true,
}
err = matcher.CompileMatchers()
require.Nil(t, err, "could not compile matcher")
isMatched, matched := request.Match(event, matcher)
require.True(t, isMatched, "could not match valid response")
require.Equal(t, []string{"test-data"}, matched)
})
}
func TestFileOperatorExtract(t *testing.T) {
@ -128,13 +148,13 @@ func TestFileOperatorExtract(t *testing.T) {
resp := "test-data\r\n1.1.1.1\r\n"
event := request.responseToDSLMap(resp, "one.one.one.one", "one.one.one.one")
require.Len(t, event, 6, "could not get correct number of items in dsl map")
require.Len(t, event, 7, "could not get correct number of items in dsl map")
require.Equal(t, resp, event["raw"], "could not get correct resp")
t.Run("extract", func(t *testing.T) {
extractor := &extractors.Extractor{
Part: "raw",
Type: "regex",
Type: extractors.TypeHolder{ExtractorType: extractors.RegexExtractor},
Regex: []string{"[0-9]+\\.[0-9]+\\.[0-9]+\\.[0-9]+"},
}
err = extractor.CompileExtractors()
@ -147,7 +167,7 @@ func TestFileOperatorExtract(t *testing.T) {
t.Run("kval", func(t *testing.T) {
extractor := &extractors.Extractor{
Type: "kval",
Type: extractors.TypeHolder{ExtractorType: extractors.KValExtractor},
KVal: []string{"raw"},
}
err = extractor.CompileExtractors()
@ -180,13 +200,13 @@ func testFileMakeResultOperators(t *testing.T, matcherCondition string) *output.
matcher := []*matchers.Matcher{
{
Part: "raw",
Type: "word",
Type: matchers.MatcherTypeHolder{MatcherType: matchers.WordsMatcher},
Words: expectedValue,
},
{
Name: namedMatcherName,
Part: "raw",
Type: "word",
Type: matchers.MatcherTypeHolder{MatcherType: matchers.WordsMatcher},
Words: expectedValue,
},
}
@ -230,7 +250,7 @@ func testFileMakeResult(t *testing.T, matchers []*matchers.Matcher, matcherCondi
Matchers: matchers,
Extractors: []*extractors.Extractor{{
Part: "raw",
Type: "regex",
Type: extractors.TypeHolder{ExtractorType: extractors.RegexExtractor},
Regex: []string{"[0-9]+\\.[0-9]+\\.[0-9]+\\.[0-9]+"},
}},
},
@ -246,7 +266,7 @@ func testFileMakeResult(t *testing.T, matchers []*matchers.Matcher, matcherCondi
fileContent := "test-data\r\n1.1.1.1\r\n"
event := request.responseToDSLMap(fileContent, "/tmp", matchedFileName)
require.Len(t, event, 6, "could not get correct number of items in dsl map")
require.Len(t, event, 7, "could not get correct number of items in dsl map")
require.Equal(t, fileContent, event["raw"], "could not get correct resp")
finalEvent := &output.InternalWrappedEvent{InternalEvent: event}

View File

@ -1,6 +1,7 @@
package file
import (
"encoding/hex"
"io/ioutil"
"os"
@ -13,10 +14,16 @@ import (
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/helpers/eventcreator"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/helpers/responsehighlighter"
"github.com/projectdiscovery/nuclei/v2/pkg/protocols/common/tostring"
templateTypes "github.com/projectdiscovery/nuclei/v2/pkg/templates/types"
)
var _ protocols.Request = &Request{}
// Type returns the type of the protocol request
func (request *Request) Type() templateTypes.ProtocolType {
return templateTypes.FileProtocol
}
// ExecuteWithResults executes the protocol requests and returns results instead of writing them.
func (request *Request) ExecuteWithResults(input string, metadata /*TODO review unused parameter*/, previous output.InternalEvent, callback protocols.OutputEventCallback) error {
wg := sizedwaitgroup.New(request.options.Options.BulkSize)
@ -49,30 +56,40 @@ func (request *Request) ExecuteWithResults(input string, metadata /*TODO review
gologger.Error().Msgf("Could not read file path %s: %s\n", filePath, err)
return
}
dataStr := tostring.UnsafeToString(buffer)
fileContent := tostring.UnsafeToString(buffer)
gologger.Verbose().Msgf("[%s] Sent FILE request to %s", request.options.TemplateID, filePath)
outputEvent := request.responseToDSLMap(dataStr, input, filePath)
outputEvent := request.responseToDSLMap(fileContent, input, filePath)
for k, v := range previous {
outputEvent[k] = v
}
event := eventcreator.CreateEvent(request, outputEvent, request.options.Options.Debug || request.options.Options.DebugResponse)
if request.options.Options.Debug || request.options.Options.DebugResponse {
gologger.Info().Msgf("[%s] Dumped file request for %s", request.options.TemplateID, filePath)
gologger.Print().Msgf("%s", responsehighlighter.Highlight(event.OperatorsResult, dataStr, request.options.Options.NoColor))
}
dumpResponse(event, request.options, fileContent, filePath)
callback(event)
}(data)
})
wg.Wait()
if err != nil {
request.options.Output.Request(request.options.TemplateID, input, "file", err)
request.options.Output.Request(request.options.TemplatePath, input, request.Type().String(), err)
request.options.Progress.IncrementFailedRequestsBy(1)
return errors.Wrap(err, "could not send file request")
}
request.options.Progress.IncrementRequests()
return nil
}
func dumpResponse(event *output.InternalWrappedEvent, requestOptions *protocols.ExecuterOptions, fileContent string, filePath string) {
cliOptions := requestOptions.Options
if cliOptions.Debug || cliOptions.DebugResponse {
hexDump := false
if responsehighlighter.HasBinaryContent(fileContent) {
hexDump = true
fileContent = hex.Dump([]byte(fileContent))
}
highlightedResponse := responsehighlighter.Highlight(event.OperatorsResult, fileContent, cliOptions.NoColor, hexDump)
gologger.Debug().Msgf("[%s] Dumped file request for %s\n\n%s", requestOptions.TemplateID, filePath, highlightedResponse)
}
}
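
The `dumpResponse` helper above decides between a plain print and a hex dump depending on whether the file content looks binary. The snippet below is a minimal, standalone sketch of that decision only, not the actual implementation: `looksBinary` is an assumed stand-in for `responsehighlighter.HasBinaryContent`, and `renderForDebug` merely mirrors the branch around `hex.Dump`.

```go
package main

import (
	"encoding/hex"
	"fmt"
	"unicode/utf8"
)

// looksBinary is an assumed stand-in for responsehighlighter.HasBinaryContent:
// here, anything that is not valid UTF-8 is treated as binary. The real helper
// may use a different heuristic.
func looksBinary(content string) bool {
	return !utf8.ValidString(content)
}

// renderForDebug mirrors the branch in dumpResponse above: binary content is
// hex-dumped before printing, text content is returned unchanged.
func renderForDebug(content string) string {
	if looksBinary(content) {
		return hex.Dump([]byte(content))
	}
	return content
}

func main() {
	fmt.Print(renderForDebug("test-data\r\n1.1.1.1\r\n"))             // printed as-is
	fmt.Print(renderForDebug(string([]byte{0x00, 0x01, 0xff, 0xfe}))) // hex dump
}
```

Colouring and match highlighting are omitted here; in the code above they are handled by `responsehighlighter.Highlight`, which now takes the additional `hexDump` flag.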


@ -8,13 +8,13 @@ import (
"github.com/stretchr/testify/require"
"github.com/projectdiscovery/nuclei/v2/internal/testutils"
"github.com/projectdiscovery/nuclei/v2/pkg/model"
"github.com/projectdiscovery/nuclei/v2/pkg/model/types/severity"
"github.com/projectdiscovery/nuclei/v2/pkg/operators"
"github.com/projectdiscovery/nuclei/v2/pkg/operators/extractors"
"github.com/projectdiscovery/nuclei/v2/pkg/operators/matchers"
"github.com/projectdiscovery/nuclei/v2/pkg/output"
"github.com/projectdiscovery/nuclei/v2/pkg/testutils"
)
func TestFileExecuteWithResults(t *testing.T) {
@ -32,12 +32,12 @@ func TestFileExecuteWithResults(t *testing.T) {
Matchers: []*matchers.Matcher{{
Name: "test",
Part: "raw",
Type: "word",
Type: matchers.MatcherTypeHolder{MatcherType: matchers.WordsMatcher},
Words: []string{"1.1.1.1"},
}},
Extractors: []*extractors.Extractor{{
Part: "raw",
Type: "regex",
Type: extractors.TypeHolder{ExtractorType: extractors.RegexExtractor},
Regex: []string{"[0-9]+\\.[0-9]+\\.[0-9]+\\.[0-9]+"},
}},
},
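
For reference, the snippet below is a plain-Go approximation of what the typed word matcher and regex extractor configured in this test check, assuming sample file content in the style used by these tests; the real matching and extraction run through the compiled operators when the event is produced.

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

func main() {
	// Sample content in the style of these file-protocol tests.
	fileContent := "test-data\r\n1.1.1.1\r\n"

	// Roughly what the "word" matcher on part "raw" asserts.
	matched := strings.Contains(fileContent, "1.1.1.1")

	// Roughly what the regex extractor on part "raw" pulls out.
	ipPattern := regexp.MustCompile(`[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+`)
	extracted := ipPattern.FindAllString(fileContent, -1)

	fmt.Println(matched, extracted) // true [1.1.1.1]
}
```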


@ -2,110 +2,6 @@ package engine
import "strings"
// ActionType defines the action type for a browser action
type ActionType int8
// Types to be executed by the user.
const (
// ActionNavigate performs a navigation to the specified URL
// URL can include nuclei payload data such as URL, Hostname, etc.
ActionNavigate ActionType = iota + 1
// ActionScript executes a JS snippet on the page.
ActionScript
// ActionClick performs the left-click action on an Element.
ActionClick
// ActionRightClick performs the right-click action on an Element.
ActionRightClick
// ActionTextInput performs an action for a text input
ActionTextInput
// ActionScreenshot performs the screenshot action writing to a file.
ActionScreenshot
// ActionTimeInput performs an action on a time input.
ActionTimeInput
// ActionSelectInput performs an action on a select input.
ActionSelectInput
// ActionFilesInput performs an action on a file input.
ActionFilesInput
// ActionWaitLoad waits for the page to stop loading.
ActionWaitLoad
// ActionGetResource performs a get resource action on an element
ActionGetResource
// ActionExtract performs an extraction on an element
ActionExtract
// ActionSetMethod sets the request method
ActionSetMethod
// ActionAddHeader adds a header to the request
ActionAddHeader
// ActionSetHeader sets a header in the request
ActionSetHeader
// ActionDeleteHeader deletes a header from the request
ActionDeleteHeader
// ActionSetBody sets the value of the request body
ActionSetBody
// ActionWaitEvent waits for a specific event.
ActionWaitEvent
// ActionKeyboard performs a keyboard action event on a page.
ActionKeyboard
// ActionDebug debug slows down headless and adds a sleep to each page.
ActionDebug
// ActionSleep executes a sleep for a specified duration
ActionSleep
// ActionWaitVisible waits until an element appears.
ActionWaitVisible
)
// ActionStringToAction converts an action from string to internal representation
var ActionStringToAction = map[string]ActionType{
"navigate": ActionNavigate,
"script": ActionScript,
"click": ActionClick,
"rightclick": ActionRightClick,
"text": ActionTextInput,
"screenshot": ActionScreenshot,
"time": ActionTimeInput,
"select": ActionSelectInput,
"files": ActionFilesInput,
"waitload": ActionWaitLoad,
"getresource": ActionGetResource,
"extract": ActionExtract,
"setmethod": ActionSetMethod,
"addheader": ActionAddHeader,
"setheader": ActionSetHeader,
"deleteheader": ActionDeleteHeader,
"setbody": ActionSetBody,
"waitevent": ActionWaitEvent,
"keyboard": ActionKeyboard,
"debug": ActionDebug,
"sleep": ActionSleep,
"waitvisible": ActionWaitVisible,
}
// ActionToActionString converts an action from internal representation to string
var ActionToActionString = map[ActionType]string{
ActionNavigate: "navigate",
ActionScript: "script",
ActionClick: "click",
ActionRightClick: "rightclick",
ActionTextInput: "text",
ActionScreenshot: "screenshot",
ActionTimeInput: "time",
ActionSelectInput: "select",
ActionFilesInput: "files",
ActionWaitLoad: "waitload",
ActionGetResource: "getresource",
ActionExtract: "extract",
ActionSetMethod: "set-method",
ActionAddHeader: "addheader",
ActionSetHeader: "setheader",
ActionDeleteHeader: "deleteheader",
ActionSetBody: "setbody",
ActionWaitEvent: "waitevent",
ActionKeyboard: "keyboard",
ActionDebug: "debug",
ActionSleep: "sleep",
ActionWaitVisible: "waitvisible",
}
// Action is an action taken by the browser to reach a navigation
//
// Each step that the browser executes is an action. Most navigations
@ -152,13 +48,13 @@ type Action struct {
// - "keyboard"
// - "debug"
// - "sleep"
-ActionType string `yaml:"action" jsonschema:"title=action to perform,description=Type of actions to perform,enum=navigate,enum=script,enum=click,enum=rightclick,enum=text,enum=screenshot,enum=time,enum=select,enum=files,enum=waitload,enum=getresource,enum=extract,enum=setmethod,enum=addheader,enum=setheader,enum=deleteheader,enum=setbody,enum=waitevent,enum=keyboard,enum=debug,enum=sleep"`
+ActionType ActionTypeHolder `yaml:"action" jsonschema:"title=action to perform,description=Type of actions to perform,enum=navigate,enum=script,enum=click,enum=rightclick,enum=text,enum=screenshot,enum=time,enum=select,enum=files,enum=waitload,enum=getresource,enum=extract,enum=setmethod,enum=addheader,enum=setheader,enum=deleteheader,enum=setbody,enum=waitevent,enum=keyboard,enum=debug,enum=sleep"`
}
// String returns the string representation of an action
func (a *Action) String() string {
builder := &strings.Builder{}
-builder.WriteString(a.ActionType)
+builder.WriteString(a.ActionType.String())
if a.Name != "" {
builder.WriteString(" Name:")
builder.WriteString(a.Name)


@ -0,0 +1,185 @@
package engine
import (
"encoding/json"
"errors"
"strings"
"github.com/alecthomas/jsonschema"
)
// ActionType defines the action type for a browser action
type ActionType int8
// Types to be executed by the user.
const (
// ActionNavigate performs a navigation to the specified URL
// URL can include nuclei payload data such as URL, Hostname, etc.
ActionNavigate ActionType = iota + 1
// ActionScript executes a JS snippet on the page.
ActionScript
// ActionClick performs the left-click action on an Element.
ActionClick
// ActionRightClick performs the right-click action on an Element.
ActionRightClick
// ActionTextInput performs an action for a text input
ActionTextInput
// ActionScreenshot performs the screenshot action writing to a file.
ActionScreenshot
// ActionTimeInput performs an action on a time input.
ActionTimeInput
// ActionSelectInput performs an action on a select input.
ActionSelectInput
// ActionFilesInput performs an action on a file input.
ActionFilesInput
// ActionWaitLoad waits for the page to stop loading.
ActionWaitLoad
// ActionGetResource performs a get resource action on an element
ActionGetResource
// ActionExtract performs an extraction on an element
ActionExtract
// ActionSetMethod sets the request method
ActionSetMethod
// ActionAddHeader adds a header to the request
ActionAddHeader
// ActionSetHeader sets a header in the request
ActionSetHeader
// ActionDeleteHeader deletes a header from the request
ActionDeleteHeader
// ActionSetBody sets the value of the request body
ActionSetBody
// ActionWaitEvent waits for a specific event.
ActionWaitEvent
// ActionKeyboard performs a keyboard action event on a page.
ActionKeyboard
// ActionDebug debug slows down headless and adds a sleep to each page.
ActionDebug
// ActionSleep executes a sleep for a specified duration
ActionSleep
// ActionWaitVisible waits until an element appears.
ActionWaitVisible
// limit
limit
)
// ActionStringToAction converts an action from string to internal representation
var ActionStringToAction = map[string]ActionType{
"navigate": ActionNavigate,
"script": ActionScript,
"click": ActionClick,
"rightclick": ActionRightClick,
"text": ActionTextInput,
"screenshot": ActionScreenshot,
"time": ActionTimeInput,
"select": ActionSelectInput,
"files": ActionFilesInput,
"waitload": ActionWaitLoad,
"getresource": ActionGetResource,
"extract": ActionExtract,
"setmethod": ActionSetMethod,
"addheader": ActionAddHeader,
"setheader": ActionSetHeader,
"deleteheader": ActionDeleteHeader,
"setbody": ActionSetBody,
"waitevent": ActionWaitEvent,
"keyboard": ActionKeyboard,
"debug": ActionDebug,
"sleep": ActionSleep,
"waitvisible": ActionWaitVisible,
}
// ActionToActionString converts an action from internal representation to string
var ActionToActionString = map[ActionType]string{
ActionNavigate: "navigate",
ActionScript: "script",
ActionClick: "click",
ActionRightClick: "rightclick",
ActionTextInput: "text",
ActionScreenshot: "screenshot",
ActionTimeInput: "time",
ActionSelectInput: "select",
ActionFilesInput: "files",
ActionWaitLoad: "waitload",
ActionGetResource: "getresource",
ActionExtract: "extract",
ActionSetMethod: "set-method",
ActionAddHeader: "addheader",
ActionSetHeader: "setheader",
ActionDeleteHeader: "deleteheader",
ActionSetBody: "setbody",
ActionWaitEvent: "waitevent",
ActionKeyboard: "keyboard",
ActionDebug: "debug",
ActionSleep: "sleep",
ActionWaitVisible: "waitvisible",
}
// GetSupportedActionTypes returns list of supported types
func GetSupportedActionTypes() []ActionType {
var result []ActionType
for index := ActionType(1); index < limit; index++ {
result = append(result, index)
}
return result
}
func toActionTypes(valueToMap string) (ActionType, error) {
normalizedValue := normalizeValue(valueToMap)
for key, currentValue := range ActionToActionString {
if normalizedValue == currentValue {
return key, nil
}
}
return -1, errors.New("Invalid action type: " + valueToMap)
}
func normalizeValue(value string) string {
return strings.TrimSpace(strings.ToLower(value))
}
func (t ActionType) String() string {
return ActionToActionString[t]
}
// ActionTypeHolder is used to hold internal type of the action
type ActionTypeHolder struct {
ActionType ActionType
}
func (holder ActionTypeHolder) String() string {
return holder.ActionType.String()
}
func (holder ActionTypeHolder) JSONSchemaType() *jsonschema.Type {
gotType := &jsonschema.Type{
Type: "string",
Title: "action to perform",
Description: "Type of actions to perform,enum=navigate,enum=script,enum=click,enum=rightclick,enum=text,enum=screenshot,enum=time,enum=select,enum=files,enum=waitload,enum=getresource,enum=extract,enum=setmethod,enum=addheader,enum=setheader,enum=deleteheader,enum=setbody,enum=waitevent,enum=keyboard,enum=debug,enum=sleep",
}
for _, types := range GetSupportedActionTypes() {
gotType.Enum = append(gotType.Enum, types.String())
}
return gotType
}
func (holder *ActionTypeHolder) UnmarshalYAML(unmarshal func(interface{}) error) error {
var marshalledTypes string
if err := unmarshal(&marshalledTypes); err != nil {
return err
}
computedType, err := toActionTypes(marshalledTypes)
if err != nil {
return err
}
holder.ActionType = computedType
return nil
}
func (holder *ActionTypeHolder) MarshalJSON() ([]byte, error) {
return json.Marshal(holder.ActionType.String())
}
func (holder ActionTypeHolder) MarshalYAML() (interface{}, error) {
return holder.ActionType.String(), nil
}
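
The new `ActionTypeHolder` follows the same typed-holder pattern used above for matcher and extractor types: YAML carries the string form, and the holder converts it to the internal enum at decode time. Below is a small, self-contained sketch of that pattern, assuming decoding goes through `gopkg.in/yaml.v2` (which is what the `UnmarshalYAML(unmarshal func(interface{}) error)` signature above implies); `stepType` and `stepTypeHolder` are illustrative names, not part of the engine.

```go
package main

import (
	"fmt"
	"strings"

	"gopkg.in/yaml.v2"
)

// stepType mirrors the ActionType pattern above: an int8 enum with a string
// table and a holder that converts between the two during (un)marshalling.
type stepType int8

const (
	stepNavigate stepType = iota + 1
	stepClick
)

var stepToString = map[stepType]string{
	stepNavigate: "navigate",
	stepClick:    "click",
}

func (s stepType) String() string { return stepToString[s] }

// stepTypeHolder plays the role of ActionTypeHolder: YAML sees a plain string,
// the engine-side struct carries the typed enum.
type stepTypeHolder struct {
	Type stepType
}

func (h *stepTypeHolder) UnmarshalYAML(unmarshal func(interface{}) error) error {
	var raw string
	if err := unmarshal(&raw); err != nil {
		return err
	}
	normalized := strings.TrimSpace(strings.ToLower(raw))
	for value, name := range stepToString {
		if normalized == name {
			h.Type = value
			return nil
		}
	}
	return fmt.Errorf("invalid step type: %s", raw)
}

func main() {
	var doc struct {
		Action stepTypeHolder `yaml:"action"`
	}
	// "action: click" decodes straight into the typed holder.
	if err := yaml.Unmarshal([]byte("action: click"), &doc); err != nil {
		panic(err)
	}
	fmt.Println(doc.Action.Type) // click
}
```

The `MarshalJSON`/`MarshalYAML` methods in the diff do the reverse, emitting the string form, so templates stay human-readable while the engine compares enum values internally.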
