Forbid extra fields in YMLs #208
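This PR makes unknown keys in content YMLs a hard validation error rather than silently ignored input. A minimal sketch of the underlying Pydantic v2 behavior, using illustrative model and field names rather than the actual contentctl classes:

```python
from pydantic import BaseModel, ConfigDict, ValidationError


class Detection(BaseModel):
    # Reject any YML key that is not declared on the model.
    model_config = ConfigDict(extra="forbid")

    name: str
    search: str


try:
    # "risk_scre" is a misspelled key; with extra="forbid" it fails loudly
    # at parse time instead of being silently dropped.
    Detection(name="Example Detection", search="| tstats count", risk_scre=25)
except ValidationError as err:
    print(err)
```

With Pydantic's default extra="ignore" behavior, the misspelled key above would simply be dropped; forbidding extras surfaces the mistake as soon as the YML is parsed.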

Open
wants to merge 108 commits into base: release_v4.2.0

Changes from all commits

Commits (108)
6f982e7
Code which still needs testing
pyth0n1c May 15, 2024
1f7afd7
add proper access for sc_admin
pyth0n1c May 15, 2024
9caf4f0
renamed acs_deploy to deploy_acs
josehelps Jun 25, 2024
1c2ef2f
Merge branch 'main' into enable_acs_deploy
pyth0n1c Jul 3, 2024
d5b08d4
Re-enable acs deployment
pyth0n1c Jul 3, 2024
60b6e1b
Add an extra, missing field to the lookup.py model called max_matches…
pyth0n1c Jul 27, 2024
fd33140
enable error for extra keys in Pydantic Objects
pyth0n1c Jul 27, 2024
e4f7dcc
update template to remove risk_score since it is a comptued_field and…
pyth0n1c Jul 27, 2024
a453237
replaced class Config by ConfigDict
mathieugonzales Aug 14, 2024
25601d9
replaced deprecated Pydantic v1 validators
mathieugonzales Aug 14, 2024
22aa2e4
Merge branch 'main' into replace_deprecated_pydantic_validators
ljstella Aug 15, 2024
bbcacda
Merge branch 'main' into enable_acs_deploy
pyth0n1c Aug 22, 2024
a1c0915
Beginnings of drilldown support
pyth0n1c Aug 23, 2024
0b48ce4
Relax requirement on search
pyth0n1c Aug 27, 2024
81d01da
Merge branch 'main' into add_drilldown_support
pyth0n1c Aug 27, 2024
b3e7330
more progess on drilldown updates
pyth0n1c Aug 28, 2024
8a9d8ec
Merge branch 'main' into enable_acs_deploy
pyth0n1c Aug 29, 2024
1e51d6d
Add optional explanation field to detections
ryanplasma Aug 8, 2024
cbb56db
Merge branch 'main' into add_explanation
pyth0n1c Sep 13, 2024
72e3354
Merge branch 'main' into add_drilldown_support
pyth0n1c Sep 17, 2024
34d0ff6
Merge branch 'main' into add_explanation
ryanplasma Sep 18, 2024
9ba9300
Add the possibility
pyth0n1c Sep 18, 2024
3c5f9f0
Merge branch 'main' into add_explanation
pyth0n1c Sep 18, 2024
7d9d128
due to a parsing issue with events created
pyth0n1c Sep 24, 2024
a17256b
Merge pull request #179 from splunk/release_v4.2.0
pyth0n1c Sep 26, 2024
8c21622
Merge branch 'main' into add_explanation
pyth0n1c Sep 26, 2024
eedd07e
Added some documentation and
pyth0n1c Sep 26, 2024
a199c72
Merge pull request #296 from splunk/ryanplasma_add_explanation
pyth0n1c Sep 26, 2024
506bbaf
Merge branch 'main' into add_detection_type_list
pyth0n1c Sep 26, 2024
dde564b
Merge pull request #293 from splunk/add_detection_type_list
pyth0n1c Sep 26, 2024
a609c03
Remove erroneous spaces from datasources used by contentctl new --typ…
pyth0n1c Sep 26, 2024
5488ca6
Merge pull request #297 from splunk/contentctl_data_source_from_enum
pyth0n1c Sep 26, 2024
9a83a3c
Merge branch 'main' into add_drilldown_support
pyth0n1c Sep 26, 2024
20e8840
remove deffault values for earliesT_offset and latest_offset.
pyth0n1c Sep 26, 2024
a849e34
experimenting with updating
pyth0n1c Sep 26, 2024
7bde9d7
Update template and drilldown object
pyth0n1c Sep 26, 2024
c0cff81
Switch drilldowns to dump in json format so
pyth0n1c Sep 27, 2024
5ca8ade
Fix serialization issue with drilldowns.
pyth0n1c Sep 27, 2024
9f30e62
Merge branch 'main' into mathieugonzales_replace_deprecated_pydantic_…
pyth0n1c Sep 27, 2024
3280fbf
remove some debugging
pyth0n1c Sep 27, 2024
5226073
Merge branch 'main' into enable_acs_deploy
pyth0n1c Sep 27, 2024
098905a
make sure that content-version is written to the correct directory.
pyth0n1c Sep 27, 2024
db7de0b
fixes to ensure that every search that
pyth0n1c Oct 1, 2024
3a4be5d
Raise exception on parse of unittest from yml. Do this rather than tr…
pyth0n1c Oct 4, 2024
c627d2e
In rare cases, if there is a new piece of
pyth0n1c Oct 5, 2024
1f15302
first go at testing on ds changes
ljstella Oct 7, 2024
32690df
Update xmltodict requirement from ^0.13.0 to >=0.13,<0.15
dependabot[bot] Oct 9, 2024
b12383e
commit simple changes so that
pyth0n1c Oct 9, 2024
d79a0a4
Improve logic for regex and macro
pyth0n1c Oct 10, 2024
2123454
add helper func: get_all_indexes
ax-hsmith Oct 14, 2024
8575d9f
increment version number
ax-hsmith Oct 14, 2024
2a470b8
change get_all_indexes back to the way it was when it was working proper
ax-hsmith Oct 14, 2024
3db70d7
forgot the return statement on get_all_indexes
ax-hsmith Oct 14, 2024
3d33130
Merge pull request #305 from splunk/simple_allow_missing_detections
pyth0n1c Oct 15, 2024
3e8e85d
Merge branch 'main' into dependabot/pip/xmltodict-gte-0.13-and-lt-0.15
pyth0n1c Oct 15, 2024
db6763a
Merge branch 'main' into raise_exception_on_malformatted_tests
pyth0n1c Oct 15, 2024
2cc708b
Merge pull request #304 from splunk/dependabot/pip/xmltodict-gte-0.13…
pyth0n1c Oct 15, 2024
8f73477
Merge branch 'main' into raise_exception_on_malformatted_tests
pyth0n1c Oct 15, 2024
31b4b21
refactoring for formatting and some logical error correction
cmcginley-splunk Oct 15, 2024
02eb5d7
Merge pull request #300 from splunk/raise_exception_on_malformatted_t…
pyth0n1c Oct 15, 2024
d3e063a
Merge pull request #308 from splunk/cmcginley/mathieugonzales_replace…
pyth0n1c Oct 15, 2024
1550aff
Merge branch 'main' into mathieugonzales_replace_deprecated_pydantic_…
pyth0n1c Oct 15, 2024
80aa067
Merge branch 'main' into add_drilldown_support
pyth0n1c Oct 15, 2024
c558216
Merge pull request #298 from splunk/mathieugonzales_replace_deprecate…
pyth0n1c Oct 15, 2024
a4f4222
Merge branch 'main' into add_drilldown_support
pyth0n1c Oct 15, 2024
fca535b
add drilldowns to default search included on contentctl init
pyth0n1c Oct 15, 2024
f2caab0
Merge branch 'add_drilldown_support' of https://github.com/splunk/con…
pyth0n1c Oct 15, 2024
f7a939b
Merge pull request #256 from splunk/add_drilldown_support
pyth0n1c Oct 15, 2024
fe17a1e
Merge branch 'main' into fix/tests-with-custom-indexes
pyth0n1c Oct 15, 2024
6052ef0
Merge pull request #307 from ax-hsmith/fix/tests-with-custom-indexes
pyth0n1c Oct 15, 2024
bf72575
add function annotation and bake the default index into the get_all_i…
pyth0n1c Oct 16, 2024
50704d2
Throw much better and descriptive exception when triyng to replay to …
pyth0n1c Oct 16, 2024
adf0f90
don't bump the version number
pyth0n1c Oct 16, 2024
cfda377
Merge pull request #309 from splunk/all_more_custom_indexes
pyth0n1c Oct 16, 2024
9c2bdff
Merge branch 'main' into enable_acs_deploy
pyth0n1c Oct 16, 2024
23f3742
Merge branch 'main' into test_on_app_change
ljstella Oct 16, 2024
adbbcb5
add a cli flag
patel-bhavin Oct 19, 2024
98808d5
eric feedback
patel-bhavin Oct 21, 2024
d12a173
still optional
patel-bhavin Oct 21, 2024
f7204a1
default to devleop
patel-bhavin Oct 21, 2024
cc84524
updating config
patel-bhavin Oct 21, 2024
3c884d9
udpating toml
patel-bhavin Oct 21, 2024
d4d7d9d
Merge pull request #311 from splunk/release_notes_udpate
pyth0n1c Oct 21, 2024
0dad956
remove "cloud" from the security_domain
pyth0n1c Oct 22, 2024
6bcb875
Fix path to fetch a saved
pyth0n1c Oct 22, 2024
b580278
Fix path that was updated incorrectly. This path is used to find a sa…
pyth0n1c Oct 22, 2024
c9dfa84
forgot to save before
pyth0n1c Oct 22, 2024
f9bcd7e
Merge pull request #316 from splunk/fix_savedsearches_path
pyth0n1c Oct 23, 2024
98f9921
Merge branch 'main' into fix_security_domain
pyth0n1c Oct 23, 2024
35d8b82
Merge pull request #314 from splunk/fix_security_domain
pyth0n1c Oct 25, 2024
825d854
Merge branch 'main' into test_on_app_change
ljstella Oct 28, 2024
81fa46e
Update pyproject.toml
pyth0n1c Oct 30, 2024
7f5319e
Update pyproject.toml
pyth0n1c Oct 30, 2024
89b8ad3
Merge branch 'main' into test_on_app_change
ljstella Oct 31, 2024
34ae585
Ensure we print the right field
ljstella Nov 6, 2024
dd77dc6
Merge branch 'main' into enable_acs_deploy
pyth0n1c Nov 6, 2024
59a3d1c
Merge pull request #146 from splunk/enable_acs_deploy
pyth0n1c Nov 6, 2024
81db497
Update pyproject.toml
pyth0n1c Nov 6, 2024
45b3a87
Merge branch 'main' into data_sources_clarification
ljstella Nov 6, 2024
3c733f1
Merge pull request #324 from splunk/data_sources_clarification
pyth0n1c Nov 6, 2024
af0ff41
Merge branch 'main' into test_on_app_change
ljstella Nov 6, 2024
bbe5da8
Merge branch 'main' into exception_on_extra_fields
pyth0n1c Nov 7, 2024
4d9a831
Typing
ljstella Nov 12, 2024
3c9395c
Version bump
ljstella Nov 12, 2024
b8b5c2d
Merge pull request #301 from splunk/test_on_app_change
pyth0n1c Nov 12, 2024
b4a9217
Merge branch 'main' into exception_on_extra_fields
pyth0n1c Nov 12, 2024
ef7784d
Move Baseline datamodel from YML field
pyth0n1c Nov 12, 2024
a27f790
make datamodel a computed
pyth0n1c Nov 12, 2024
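
Several of the commits above (a453237 "replaced class Config by ConfigDict", 25601d9 "replaced deprecated Pydantic v1 validators") migrate the models from Pydantic v1 style to v2, where the extra-key behavior is configured via ConfigDict. A minimal sketch of that migration; the model below is illustrative rather than the actual contentctl Lookup class, though the max_matches field echoes commit 60b6e1b:

```python
from pydantic import BaseModel, ConfigDict, field_validator


class Lookup(BaseModel):
    # Pydantic v1 used `class Config: extra = "forbid"`; v2 uses model_config.
    model_config = ConfigDict(extra="forbid")

    name: str
    max_matches: int = 1

    # Pydantic v1 used the now-deprecated @validator; v2 uses @field_validator.
    @field_validator("max_matches")
    @classmethod
    def max_matches_must_be_positive(cls, value: int) -> int:
        if value < 1:
            raise ValueError("max_matches must be >= 1")
        return value
```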
4 changes: 3 additions & 1 deletion contentctl/actions/build.py
@@ -51,7 +51,9 @@ def execute(self, input_dto: BuildInputDto) -> DirectorOutputDto:
updated_conf_files.update(conf_output.writeObjects(input_dto.director_output_dto.lookups, SecurityContentType.lookups))
updated_conf_files.update(conf_output.writeObjects(input_dto.director_output_dto.macros, SecurityContentType.macros))
updated_conf_files.update(conf_output.writeObjects(input_dto.director_output_dto.dashboards, SecurityContentType.dashboards))
updated_conf_files.update(conf_output.writeAppConf())
updated_conf_files.update(conf_output.writeMiscellaneousAppFiles())



#Ensure that the conf file we just generated/update is syntactically valid
for conf_file in updated_conf_files:
81 changes: 49 additions & 32 deletions contentctl/actions/deploy_acs.py
@@ -1,38 +1,55 @@
from dataclasses import dataclass
from contentctl.input.director import DirectorInputDto
from contentctl.output.conf_output import ConfOutput


from typing import Union

@dataclass(frozen=True)
class ACSDeployInputDto:
director_input_dto: DirectorInputDto
splunk_api_username: str
splunk_api_password: str
splunk_cloud_jwt_token: str
splunk_cloud_stack: str
stack_type: str
from contentctl.objects.config import deploy_acs, StackType
from requests import post
import pprint


class Deploy:
def execute(self, input_dto: ACSDeployInputDto) -> None:

conf_output = ConfOutput(input_dto.director_input_dto.input_path, input_dto.director_input_dto.config)
def execute(self, config: deploy_acs, appinspect_token:str) -> None:

appinspect_token = conf_output.inspectAppAPI(input_dto.splunk_api_username, input_dto.splunk_api_password, input_dto.stack_type)
#The following common headers are used by both Classic and Victoria
headers = {
'Authorization': f'Bearer {config.splunk_cloud_jwt_token}',
'ACS-Legal-Ack': 'Y'
}
try:

with open(config.getPackageFilePath(include_version=False),'rb') as app_data:
#request_data = app_data.read()
if config.stack_type == StackType.classic:
# Classic instead uses a form to store token and package
# https://docs.splunk.com/Documentation/SplunkCloud/9.1.2308/Config/ManageApps#Manage_private_apps_using_the_ACS_API_on_Classic_Experience
address = f"https://admin.splunk.com/{config.splunk_cloud_stack}/adminconfig/v2/apps"

form_data = {
'token': (None, appinspect_token),
'package': app_data
}
res = post(address, headers=headers, files = form_data)
elif config.stack_type == StackType.victoria:
# Victoria uses the X-Splunk-Authorization Header
# It also uses --data-binary for the app content
# https://docs.splunk.com/Documentation/SplunkCloud/9.1.2308/Config/ManageApps#Manage_private_apps_using_the_ACS_API_on_Victoria_Experience
headers.update({'X-Splunk-Authorization': appinspect_token})
address = f"https://admin.splunk.com/{config.splunk_cloud_stack}/adminconfig/v2/apps/victoria"
res = post(address, headers=headers, data=app_data.read())
else:
raise Exception(f"Unsupported stack type: '{config.stack_type}'")
except Exception as e:
raise Exception(f"Error installing to stack '{config.splunk_cloud_stack}' (stack_type='{config.stack_type}') via ACS:\n{str(e)}")


if input_dto.splunk_cloud_jwt_token is None or input_dto.splunk_cloud_stack is None:
if input_dto.splunk_cloud_jwt_token is None:
raise Exception("Cannot deploy app via ACS, --splunk_cloud_jwt_token was not defined on command line.")
else:
raise Exception("Cannot deploy app via ACS, --splunk_cloud_stack was not defined on command line.")

conf_output.deploy_via_acs(input_dto.splunk_cloud_jwt_token,
input_dto.splunk_cloud_stack,
appinspect_token,
input_dto.stack_type)

try:
# Request went through and completed, but may have returned a non-successful error code.
# This likely includes a more verbose response describing the error
res.raise_for_status()
print(res.json())
except Exception as e:
try:
error_text = res.json()
except Exception as e:
error_text = "No error text - request failed"
formatted_error_text = pprint.pformat(error_text)
print("While this may not be the cause of your error, ensure that the uid and appid of your Private App does not exist in Splunkbase\n"
"ACS cannot deploy and app with the same uid or appid as one that exists in Splunkbase.")
raise Exception(f"Error installing to stack '{config.splunk_cloud_stack}' (stack_type='{config.stack_type}') via ACS:\n{formatted_error_text}")


print(f"'{config.getPackageFilePath(include_version=False)}' successfully installed to stack '{config.splunk_cloud_stack}' (stack_type='{config.stack_type}') via ACS!")
35 changes: 23 additions & 12 deletions contentctl/actions/detection_testing/GitService.py
@@ -13,6 +13,7 @@
from contentctl.objects.macro import Macro
from contentctl.objects.lookup import Lookup
from contentctl.objects.detection import Detection
from contentctl.objects.data_source import DataSource
from contentctl.objects.security_content_object import SecurityContentObject
from contentctl.objects.config import test_common, All, Changes, Selected

@@ -67,9 +68,12 @@ def getChanges(self, target_branch:str)->List[Detection]:

#Make a filename to content map
filepath_to_content_map = { obj.file_path:obj for (_,obj) in self.director.name_to_content_map.items()}
updated_detections:List[Detection] = []
updated_macros:List[Macro] = []
updated_lookups:List[Lookup] =[]

updated_detections: set[Detection] = set()
updated_macros: set[Macro] = set()
updated_lookups: set[Lookup] = set()
updated_datasources: set[DataSource] = set()


for diff in all_diffs:
if type(diff) == pygit2.Patch:
@@ -80,16 +84,23 @@ def getChanges(self, target_branch:str)->List[Detection]:
if decoded_path.is_relative_to(self.config.path/"detections") and decoded_path.suffix == ".yml":
detectionObject = filepath_to_content_map.get(decoded_path, None)
if isinstance(detectionObject, Detection):
updated_detections.append(detectionObject)
updated_detections.add(detectionObject)
else:
raise Exception(f"Error getting detection object for file {str(decoded_path)}")

elif decoded_path.is_relative_to(self.config.path/"macros") and decoded_path.suffix == ".yml":
macroObject = filepath_to_content_map.get(decoded_path, None)
if isinstance(macroObject, Macro):
updated_macros.append(macroObject)
updated_macros.add(macroObject)
else:
raise Exception(f"Error getting macro object for file {str(decoded_path)}")

elif decoded_path.is_relative_to(self.config.path/"data_sources") and decoded_path.suffix == ".yml":
datasourceObject = filepath_to_content_map.get(decoded_path, None)
if isinstance(datasourceObject, DataSource):
updated_datasources.add(datasourceObject)
else:
raise Exception(f"Error getting data source object for file {str(decoded_path)}")

elif decoded_path.is_relative_to(self.config.path/"lookups"):
# We need to convert this to a yml. This means we will catch
@@ -98,7 +109,7 @@ def getChanges(self, target_branch:str)->List[Detection]:
updatedLookup = filepath_to_content_map.get(decoded_path, None)
if not isinstance(updatedLookup,Lookup):
raise Exception(f"Expected {decoded_path} to be type {type(Lookup)}, but instead if was {(type(updatedLookup))}")
updated_lookups.append(updatedLookup)
updated_lookups.add(updatedLookup)

elif decoded_path.suffix == ".csv":
# If the CSV was updated, we want to make sure that we
@@ -115,7 +126,6 @@ def getChanges(self, target_branch:str)->List[Detection]:
# Detected a changed .mlmodel file. However, since we do not have testing for these detections at
# this time, we will ignore this change.
updatedLookup = None


else:
raise Exception(f"Detected a changed file in the lookups/ directory '{str(decoded_path)}'.\n"
@@ -125,7 +135,7 @@ def getChanges(self, target_branch:str)->List[Detection]:
if updatedLookup is not None and updatedLookup not in updated_lookups:
# It is possible that both the CSV and YML have been modified for the same lookup,
# and we do not want to add it twice.
updated_lookups.append(updatedLookup)
updated_lookups.add(updatedLookup)

else:
pass
@@ -136,24 +146,25 @@ def getChanges(self, target_branch:str)->List[Detection]:

# If a detection has at least one dependency on changed content,
# then we must test it again
changed_macros_and_lookups = updated_macros + updated_lookups

changed_macros_and_lookups_and_datasources:set[SecurityContentObject] = updated_macros.union(updated_lookups, updated_datasources)

for detection in self.director.detections:
if detection in updated_detections:
# we are already planning to test it, don't need
# to add it again
continue

for obj in changed_macros_and_lookups:
for obj in changed_macros_and_lookups_and_datasources:
if obj in detection.get_content_dependencies():
updated_detections.append(detection)
updated_detections.add(detection)
break

#Print out the names of all modified/new content
modifiedAndNewContentString = "\n - ".join(sorted([d.name for d in updated_detections]))

print(f"[{len(updated_detections)}] Pieces of modifed and new content (this may include experimental/deprecated/manual_test content):\n - {modifiedAndNewContentString}")
return updated_detections
return sorted(list(updated_detections))

def getSelected(self, detectionFilenames: List[FilePath]) -> List[Detection]:
filepath_to_content_map: dict[FilePath, SecurityContentObject] = {
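
The GitService change above swaps the updated-content lists for sets and returns a sorted list at the end. That is what keeps a lookup whose CSV and YML both changed from being queued twice, and keeps the "retest anything that depends on changed content" check a simple membership test. A reduced sketch of the pattern with invented content names:

```python
# Reduced model of the change-detection pattern above; all names are illustrative.

# A lookup's YML and its CSV resolve to the same logical lookup object;
# collecting them in a set means it is only scheduled for testing once.
file_to_lookup = {
    "lookups/asset_lookup.yml": "asset_lookup",
    "lookups/asset_lookup.csv": "asset_lookup",
}
updated_lookups: set[str] = set(file_to_lookup.values())

updated_detections: set[str] = {"Detection A"}        # changed directly
dependencies: dict[str, set[str]] = {
    "Detection B": {"asset_lookup"},                   # depends on changed content
    "Detection C": {"some_other_macro"},               # dependencies untouched
}

# Any detection that depends on changed macros/lookups/data sources is retested.
for detection, deps in dependencies.items():
    if detection not in updated_detections and deps & updated_lookups:
        updated_detections.add(detection)

# Returning sorted(list(...)) gives a deterministic order, as in getChanges() above.
print(sorted(updated_detections))   # ['Detection A', 'Detection B']
```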