Monday, June 23, 2025

SAS Azure

 

Azure AD credentials for User Delegation SAS: A complete implementation guide

User Delegation SAS tokens represent the most secure method for granting temporary access to Azure Storage, using Azure AD credentials instead of storage account keys. This guide clarifies exactly what credentials are involved and how to implement them properly in web applications.

Understanding the credential confusion

The key insight is that User Delegation SAS requires two separate identity concepts: the creating identity (your web app's Azure AD credentials) and the using identity (whoever uses the SAS token). Many developers confuse these roles, particularly when they can log in to Azure Storage Explorer interactively but struggle with programmatic access.

When you log in to Azure Storage Explorer, you're using your personal Azure credentials (interactive authentication). However, programmatic SAS creation in web applications requires different credential types designed for automated scenarios.

What Azure AD credentials are actually used for programmatic SAS creation

The authentication flow

User Delegation SAS creation follows a specific four-phase process:

  1. Azure AD Authentication: Your web application authenticates with Azure AD using one of several credential types
  2. User Delegation Key Acquisition: The authenticated app requests a special cryptographic key from Azure Storage
  3. SAS Token Generation: Your app signs the SAS parameters using this key
  4. SAS Token Usage: External users/applications can access storage using the SAS token (no Azure AD required)

Supported credential types for programmatic access

The Azure SDK uses the TokenCredential interface with these implementations:

Managed Identity (Recommended for production)

  • ManagedIdentityCredential: Uses Azure's built-in identity service
  • No secrets to manage, automatic credential rotation
  • Only works on Azure resources (App Service, Functions, VMs, etc.)

Service Principal (For external or cross-platform scenarios)

  • ClientSecretCredential: Uses application ID and secret
  • CertificateCredential: Uses application ID and certificate
  • Requires manual credential management and rotation

Development/Testing

  • DefaultAzureCredential: Automatically discovers available authentication methods
  • AzureCliCredential: Uses Azure CLI authentication context
  • InteractiveBrowserCredential: For interactive scenarios
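
Because all of these implement the same TokenCredential abstraction, the storage client can be constructed once and the credential swapped freely. A minimal sketch (the angle-bracket name is a placeholder, and the commented alternatives are interchangeable):

using Azure.Core;
using Azure.Identity;
using Azure.Storage.Blobs;

// Any TokenCredential implementation plugs into the same client:
TokenCredential credential = new DefaultAzureCredential();
// TokenCredential credential = new ManagedIdentityCredential();                        // Azure-hosted
// TokenCredential credential = new ClientSecretCredential(tenantId, clientId, secret); // service principal

var blobServiceClient = new BlobServiceClient(
    new Uri("https://<storage-account>.blob.core.windows.net"),
    credential);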

Critical difference: Creator vs User identities

Creating Identity (Your Web App):          Using Identity (SAS Token Holder):
├─ Must be Azure AD authenticated          ├─ No Azure AD authentication needed
├─ Needs RBAC permissions                  ├─ No RBAC permissions required
├─ Identity embedded in SAS token          ├─ Identity not tracked by Azure
├─ Audit trail of SAS creation             ├─ Only usage logged anonymously
└─ Limited to 7-day SAS validity           └─ Access controlled by SAS permissions

Setting up proper Azure AD identity for your web app

Step 1: Choose the right credential type

For Azure-hosted web apps (App Service, Functions, Container Apps):

# Enable system-assigned managed identity
az webapp identity assign --name mywebapp --resource-group myResourceGroup

# Get the principal ID for role assignment
PRINCIPAL_ID=$(az webapp identity show --name mywebapp --resource-group myResourceGroup --query principalId --output tsv)

For external or multi-platform apps:

# Create service principal
az ad sp create-for-rbac --name "mywebapp-sp" --role "Storage Blob Delegator" \
  --scopes "/subscriptions/{subscription-id}/resourceGroups/{resource-group}/providers/Microsoft.Storage/storageAccounts/{storage-account}"

Step 2: Assign required permissions

The creating identity needs this specific action:

  • Microsoft.Storage/storageAccounts/blobServices/generateUserDelegationKey

Built-in roles that include this permission:

  • Storage Blob Delegator (minimum required)
  • Storage Blob Data Contributor (if you also need data access)
  • Storage Blob Data Owner (for full access)

# Assign the minimum required role
az role assignment create \
  --role "Storage Blob Delegator" \
  --assignee $PRINCIPAL_ID \
  --scope "/subscriptions/{subscription-id}/resourceGroups/{resource-group}/providers/Microsoft.Storage/storageAccounts/{storage-account}"

Implementation examples

C#/.NET with Managed Identity

using Azure.Identity;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

public class UserDelegationSasService
{
    private readonly BlobServiceClient _blobServiceClient;
    private readonly string _storageAccountName;

    public UserDelegationSasService(string storageAccountName)
    {
        _storageAccountName = storageAccountName;
        var endpoint = $"https://{storageAccountName}.blob.core.windows.net";
        
        // DefaultAzureCredential automatically discovers managed identity
        _blobServiceClient = new BlobServiceClient(new Uri(endpoint), new DefaultAzureCredential());
    }

    public async Task<string> CreateUserDelegationSasAsync(
        string containerName, 
        string blobName, 
        TimeSpan validity)
    {
        // Step 1: Get user delegation key using Azure AD credentials
        var userDelegationKey = await _blobServiceClient.GetUserDelegationKeyAsync(
            DateTimeOffset.UtcNow,
            DateTimeOffset.UtcNow.Add(validity));

        // Step 2: Create SAS builder with desired permissions
        var sasBuilder = new BlobSasBuilder
        {
            BlobContainerName = containerName,
            BlobName = blobName,
            Resource = "b", // "b" for blob, "c" for container
            StartsOn = DateTimeOffset.UtcNow.AddMinutes(-5), // Clock skew tolerance
            ExpiresOn = DateTimeOffset.UtcNow.Add(validity)
        };

        // Step 3: Set specific permissions (principle of least privilege)
        sasBuilder.SetPermissions(BlobSasPermissions.Read | BlobSasPermissions.Write);

        // Step 4: Generate signed SAS token
        var sasToken = sasBuilder.ToSasQueryParameters(
            userDelegationKey, 
            _storageAccountName);
        
        return sasToken.ToString();
    }
}

// ASP.NET Core integration
public void ConfigureServices(IServiceCollection services)
{
    services.AddSingleton<UserDelegationSasService>(provider =>
    {
        var configuration = provider.GetRequiredService<IConfiguration>();
        var storageAccountName = configuration["Azure:StorageAccountName"];
        return new UserDelegationSasService(storageAccountName);
    });
}

JavaScript/Node.js with Service Principal

const { BlobServiceClient, generateBlobSASQueryParameters, BlobSASPermissions } = require('@azure/storage-blob');
const { ClientSecretCredential } = require('@azure/identity');

class UserDelegationSasService {
    constructor(storageAccountName, tenantId, clientId, clientSecret) {
        this.storageAccountName = storageAccountName;
        
        // Service principal authentication
        const credential = new ClientSecretCredential(tenantId, clientId, clientSecret);
        
        this.blobServiceClient = new BlobServiceClient(
            `https://${storageAccountName}.blob.core.windows.net`,
            credential
        );
    }

    async createUserDelegationSas(containerName, blobName, validityHours = 1) {
        const now = new Date();
        const startTime = new Date(now.getTime() - 5 * 60 * 1000); // 5 minutes ago
        const expiryTime = new Date(now.getTime() + validityHours * 60 * 60 * 1000);

        // Get user delegation key
        const userDelegationKey = await this.blobServiceClient.getUserDelegationKey(
            startTime,
            expiryTime
        );

        // Generate SAS with specific permissions
        const sasQueryParams = generateBlobSASQueryParameters({
            containerName,
            blobName,
            permissions: BlobSASPermissions.parse('rw'), // read, write
            startsOn: startTime,
            expiresOn: expiryTime
        }, userDelegationKey, this.storageAccountName);

        return sasQueryParams.toString();
    }
}

Storage account access vs Azure AD authentication relationship

User Delegation SAS operates under a dual permission model:

Level 1: Azure AD RBAC Permissions (Creation Time)

  • Required for generating User Delegation Keys
  • Evaluated when your app calls generateUserDelegationKey
  • Does not grant direct data access, only SAS creation capability
  • Scoped at subscription, resource group, storage account, or container level

Level 2: SAS Token Permissions (Usage Time)

  • Specified in the SAS token itself (read, write, delete, list, etc.)
  • Independent of the creator's RBAC permissions
  • Can be more restrictive than creator's permissions
  • Evaluated for each storage operation using the SAS

Permission interaction examples

Scenario 1: High privilege creator, limited SAS

Creator RBAC: Storage Blob Data Owner (full access)
SAS Permissions: Read only
Result: SAS users can only read, despite creator having full access

Scenario 2: Creator with delegation permission only

Creator RBAC: Storage Blob Delegator (SAS creation only)
SAS Permissions: Read/Write
Result: SAS creation succeeds, tokens work for read/write operations
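
Scenario 1 expressed in code, reusing the BlobSasBuilder pattern from the examples above (the container and blob names are placeholders):

// The creator may hold Storage Blob Data Owner, yet this SAS grants read only.
var sasBuilder = new BlobSasBuilder
{
    BlobContainerName = "reports",    // placeholder
    BlobName = "summary.csv",         // placeholder
    Resource = "b",
    ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
};

// Usage-time ceiling: token holders can never exceed these permissions,
// regardless of the creator's RBAC role.
sasBuilder.SetPermissions(BlobSasPermissions.Read);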

Multi-tenant web application considerations

Tenant isolation strategies

Container-per-tenant approach:

public async Task<string> CreateTenantSasAsync(string tenantId, string fileName, TimeSpan validity)
{
    // Use tenant-specific container
    var containerName = $"tenant-{tenantId}";
    
    // Ensure container exists with proper access controls
    var containerClient = _blobServiceClient.GetBlobContainerClient(containerName);
    await containerClient.CreateIfNotExistsAsync();
    
    return await CreateUserDelegationSasAsync(containerName, fileName, validity);
}

Path-based isolation:

public async Task<string> CreateTenantSasAsync(string tenantId, string fileName, TimeSpan validity)
{
    // Use tenant-specific path within shared container
    var blobName = $"tenants/{tenantId}/{fileName}";
    
    return await CreateUserDelegationSasAsync("shared-container", blobName, validity);
}

Azure AD configuration for multi-tenancy

For B2B scenarios (business tenants):

var options = new DefaultAzureCredentialOptions
{
    TenantId = guestTenantId // Target tenant for B2B scenario
};

var credential = new DefaultAzureCredential(options);

Best practices for credential management

Production deployment checklist

✅ Security best practices:

  • Use managed identities for Azure-hosted applications
  • Implement certificate-based authentication for service principals
  • Never hardcode credentials in source code
  • Store secrets in Azure Key Vault when required
  • Enable comprehensive audit logging

✅ Operational best practices:

  • Implement short-lived SAS tokens (1-24 hours typical)
  • Cache user delegation keys (reuse for up to 7 days; a caching sketch follows this list)
  • Set up proper error handling for authentication failures
  • Monitor SAS token usage patterns
  • Implement token revocation procedures
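
A minimal caching sketch for the delegation key, building on the UserDelegationSasService above. It assumes a single-instance service and shows no locking; production code would add thread safety:

private UserDelegationKey _cachedKey;
private DateTimeOffset _cachedKeyExpiry;

private async Task<UserDelegationKey> GetCachedDelegationKeyAsync()
{
    // Refresh when no key is cached or it expires within the next 10 minutes
    if (_cachedKey == null || _cachedKeyExpiry < DateTimeOffset.UtcNow.AddMinutes(10))
    {
        _cachedKeyExpiry = DateTimeOffset.UtcNow.AddDays(1); // keys may live up to 7 days
        _cachedKey = await _blobServiceClient.GetUserDelegationKeyAsync(
            DateTimeOffset.UtcNow, _cachedKeyExpiry);
    }
    return _cachedKey;
}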

✅ Multi-tenant considerations:

  • Ensure proper tenant isolation in storage structure
  • Validate tenant context before SAS generation
  • Implement tenant-specific monitoring and alerting
  • Consider data residency requirements per tenant

Common security pitfalls to avoid

❌ Authentication mistakes:

  • Using personal login credentials in production
  • Overly permissive application permissions
  • Weak session management practices
  • Misconfigured redirect URIs

❌ SAS token security issues:

  • Overly broad SAS permissions
  • Extended validity periods without revocation
  • Distributing SAS tokens over HTTP
  • Lack of usage monitoring

❌ Credential management errors:

  • Hardcoded credentials in source code
  • Shared credentials across applications
  • Missing credential rotation procedures
  • Insufficient credential protection

Monitoring and compliance

Essential monitoring setup

Azure Storage logs to track:

  • User delegation key generation events
  • SAS token usage patterns
  • Failed authentication attempts
  • Unusual access patterns

Key alerts to configure:

  • Unusual SAS token usage patterns
  • Failed authentication attempts
  • Access from unexpected locations
  • High-volume data access

Compliance considerations

Regulatory requirements:

  • GDPR: Data encryption, access logging, audit trails
  • HIPAA: Access controls, audit logging, encryption
  • SOC 2: Security controls, monitoring, incident response

Azure Policy enforcement:

  • Disable shared key access on storage accounts
  • Require encryption in transit and at rest
  • Enforce network access restrictions
  • Mandate audit logging

Conclusion

User Delegation SAS with Azure AD credentials provides enterprise-grade security for storage access delegation. The key is understanding that your web application's Azure AD identity (managed identity or service principal) creates the SAS tokens, while the actual users of those tokens don't need Azure AD authentication.

For immediate implementation:

  1. Use managed identity if your web app runs on Azure
  2. Assign "Storage Blob Delegator" role to your app's identity
  3. Use DefaultAzureCredential in your code for automatic credential discovery
  4. Implement short-lived SAS tokens with minimal required permissions
  5. Set up comprehensive monitoring and audit logging

This approach eliminates the security risks of storage account keys while providing flexible, auditable access control for your multi-tenant web application.

Saturday, June 21, 2025

azure blob storage

Azure File Sharing Architecture Report

Executive Summary

This document evaluates different approaches for securely sharing CSV files with external partner organizations through our multi-tenant web application. The key requirement is to share files that auto-expire after 30 days while preventing unauthorized access through email forwarding.

Business Context

  • Current Setup: Multi-tenant web application with Microsoft Graph API permissions from partner organizations
  • Requirement: Share CSV files with external users from partner organizations
  • Security Concern: Prevent unauthorized access if emails containing file links are forwarded
  • Compliance: Files must automatically expire/delete after 30 days
  • Architecture: Cross-tenant scenario (our storage account in our tenant, external users in their own tenants)

Technical Approaches Evaluated

1. SAS (Shared Access Signature) Tokens - ❌ Not Recommended

Description: Generate time-limited URLs that provide direct access to Azure Storage files.

Types of SAS:

  • Regular SAS: Created using storage account keys
  • User Delegation SAS: Created using Azure AD credentials (more secure)
  • Account SAS: Grants access to entire storage account
  • Service SAS: Scoped to specific service (Blob, Queue, Table, File)

Implementation:

1. Generate SAS token with 30-day expiry
2. Email SAS URL directly to users
3. Users access files directly from Azure Storage

Advantages:

  • Simple implementation
  • Direct access to Azure Storage (good performance)
  • Built-in expiry mechanism

Critical Disadvantages:

  • Forwarding Risk: Anyone who receives forwarded email can access files
  • No identity verification: Bearer token approach
  • Limited audit trail: Difficult to track who actually accessed files
  • No granular permission control: Access is binary (have URL = access)

Verdict: ❌ Rejected due to security concerns

2. Azure AD Authentication + Web App Gateway - ✅ Recommended

Description: External users authenticate through our multi-tenant web app, which then serves files directly from storage.

Implementation:

1. User clicks link in email → Redirected to our web app
2. User authenticates using their organizational credentials (Azure AD)
3. Our app validates user is from authorized organization
4. Our app reads file from storage using managed identity
5. Our app serves file directly to authenticated user

Authentication Flow:

  • External Users: OAuth 2.0 through multi-tenant Azure AD
  • App to Storage: Managed Identity with Storage Blob Data Reader role
  • File Lifecycle: Azure Blob Lifecycle Management (30-day auto-deletion)
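
A minimal sketch of steps 3-5 as an ASP.NET Core endpoint. The container/path layout, the tenant-id claim lookup, and the _partnerRegistry authorization check are illustrative assumptions, not part of the design above:

[Authorize] // the user signed in through the multi-tenant Azure AD app
[HttpGet("files/{fileName}")]
public async Task<IActionResult> DownloadFile(string fileName)
{
    // Step 3: validate the caller's organization before serving anything
    var tenantId = User.FindFirst("http://schemas.microsoft.com/identity/claims/tenantid")?.Value;
    if (!_partnerRegistry.IsAuthorized(tenantId))
        return Forbid();

    // Step 4: read the blob with the app's managed identity
    // (_blobServiceClient was built with DefaultAzureCredential)
    var blobClient = _blobServiceClient
        .GetBlobContainerClient("shared-container")
        .GetBlobClient($"tenants/{tenantId}/{fileName}");

    var exists = await blobClient.ExistsAsync();
    if (!exists.Value)
        return NotFound();

    // Step 5: stream the CSV to the authenticated user
    var stream = await blobClient.OpenReadAsync();
    return File(stream, "text/csv", fileName);
}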

Advantages:

  • No forwarding risk: Users must authenticate to access files
  • Identity-based security: Leverages existing organizational credentials
  • Full audit trail: Complete logging of who accessed what
  • Granular control: Can restrict access by user/organization
  • Leverages existing infrastructure: Uses current multi-tenant app setup
  • Microsoft's recommended approach: Follows security best practices

Disadvantages:

  • Uses app bandwidth for file serving
  • Slightly more complex implementation
  • App becomes bottleneck for large files

Performance Considerations:

  • Suitable for typical CSV file sizes
  • May impact app performance under heavy concurrent usage

3. Hybrid Approach: Azure AD + Dynamic SAS Generation - ✅ Alternative Option

Description: Users authenticate through web app, which then generates short-lived SAS tokens for direct storage access.

Implementation:

1. User clicks link in email → Redirected to our web app
2. User authenticates using organizational credentials
3. Our app validates permissions for specific file
4. Our app generates short-lived SAS token (15-30 minutes)
5. User downloads directly from Azure Storage using SAS token
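
A sketch of the hybrid variant, reusing the CreateUserDelegationSasAsync service from the previous post; the route, the _partnerRegistry check, and the storage layout are assumptions:

[Authorize]
[HttpGet("files/{fileName}/download-link")]
public async Task<IActionResult> GetDownloadLink(string fileName)
{
    var tenantId = User.FindFirst("http://schemas.microsoft.com/identity/claims/tenantid")?.Value;
    if (!_partnerRegistry.IsAuthorized(tenantId))
        return Forbid();

    // 15 minutes keeps the exposure window small even if the URL leaks
    var sasToken = await _sasService.CreateUserDelegationSasAsync(
        "shared-container", $"tenants/{tenantId}/{fileName}", TimeSpan.FromMinutes(15));

    var url = $"https://{_storageAccountName}.blob.core.windows.net/" +
              $"shared-container/tenants/{tenantId}/{fileName}?{sasToken}";
    return Ok(new { downloadUrl = url });
}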

Advantages:

  • No forwarding risk: Authentication required before SAS generation
  • Better performance: Direct downloads from Azure Storage
  • Scalability: App only handles authentication, not file serving
  • Cost efficiency: Lower bandwidth costs for app
  • Identity-based security: Maintains authentication requirements

Disadvantages:

  • More complex implementation (SAS generation logic)
  • Slightly larger attack surface (SAS tokens exist, even if short-lived)

Alternative Approaches Considered and Rejected

Azure AD B2B (Guest Users)

Why Rejected: Requires manual invitation process and guest user management overhead for each partner organization user.

Direct Azure AD Authentication to Storage

Why Rejected: Complex cross-tenant configuration required since our storage is in our tenant while external users are in their own tenants.

Shared Key Authorization

Why Rejected: Requires sharing sensitive storage account keys; not suitable for external user scenarios.

Technical Architecture Details

Storage Configuration

  • Service: Azure Blob Storage
  • Lifecycle Management: Automatic deletion after 30 days
  • Access Control: Role-Based Access Control (RBAC)
  • Required Role: Storage Blob Data Reader (for app's managed identity)

Authentication Infrastructure

  • Identity Provider: Microsoft Entra ID (formerly Azure AD)
  • App Registration: Multi-tenant configuration
  • Token Type: OAuth 2.0 access tokens
  • Cross-tenant Support: Built-in multi-tenant capabilities

Security Features

  • Encryption: HTTPS for all communications
  • Audit Logging: Full request/response logging
  • Identity Verification: Organizational credential requirements
  • Permission Scoping: Granular access control per user/organization

Recommendations

Primary Recommendation: Azure AD + Web App Gateway

Rationale:

  • Addresses all security requirements
  • Leverages existing multi-tenant infrastructure
  • Provides complete audit trail
  • Follows Microsoft security best practices
  • Suitable for typical CSV file sharing scenarios

Implementation Priority: High

Secondary Option: Hybrid Approach

When to Consider:

  • Large file sizes (>10MB)
  • High concurrent usage expected
  • Performance optimization is critical

Implementation Priority: Medium (consider for future optimization)

Implementation Roadmap

Phase 1: Core Implementation (2-3 weeks)

  1. Configure Azure Blob Lifecycle Management (30-day deletion)
  2. Set up managed identity for web app
  3. Implement authentication endpoint for external users
  4. Create file serving endpoint with permission validation
  5. Update email templates with web app links

Phase 2: Optimization (1-2 weeks)

  1. Add comprehensive logging and monitoring
  2. Implement file access analytics
  3. Add user-friendly error handling
  4. Performance testing and optimization

Phase 3: Future Enhancements (Optional)

  1. Consider hybrid approach if performance issues arise
  2. Add file preview capabilities
  3. Implement bulk download features

Risk Assessment

Security Risks: Low

  • Authentication required for all access
  • No direct storage URLs in emails
  • Complete audit trail
  • Automatic file expiry

Performance Risks: Low-Medium

  • App bandwidth usage for file serving
  • Potential bottleneck under high load
  • Mitigated by typical CSV file sizes

Implementation Risks: Low

  • Leverages existing authentication infrastructure
  • Well-documented Azure services
  • Clear migration path from current setup

Cost Implications

Storage Costs

  • Standard Azure Blob Storage pricing
  • Lifecycle management included
  • Minimal cost impact

Compute Costs

  • Slight increase in app compute usage for file serving
  • Offset by improved security and compliance

Development Costs

  • Estimated 3-4 weeks development time
  • Leverages existing team Azure expertise

Conclusion

The Azure AD Authentication + Web App Gateway approach provides the optimal balance of security, functionality, and implementation simplicity for our use case. It eliminates the email forwarding security risk while leveraging our existing multi-tenant infrastructure and following Microsoft's recommended security practices.

The hybrid approach should be considered as a future optimization if performance requirements change or file sizes significantly increase.

Monday, May 12, 2025

C# Polymorphism, Extension method

 Okay, here's a sample daily learning log based on our conversation. Imagine you went through these discoveries over a few days:


My Learning Log: C# Method Resolution & Dapper

Date: May 10, 2025 (Simulated Day 1)

  • Topic/Problem: My custom LoggingDbConnection.QueryAsync() method (which adds SQL logging) isn't being called. Instead, a different QueryAsync seems to be executing when I use my IDbConnection variable.
  • What I Learned/Discovered:
    • The _dbConnection variable in my service was declared as IDbConnection.
    • Dapper provides QueryAsync as an extension method on IDbConnection.
    • My LoggingDbConnection class is being instantiated and used, but its specific QueryAsync method is being bypassed.
  • Challenges/Confusions: Why is Dapper's version called when my object is clearly a LoggingDbConnection which has its own QueryAsync? I thought the object's actual type would determine the method.
  • Solutions/Clarifications:
    • The issue might be related to how C# resolves method calls when interfaces and extension methods are involved.
    • Potential Fix 1: Changing the declared type of _dbConnection in my service from IDbConnection to the concrete LoggingDbConnection.
    • Potential Fix 2: Explicitly casting _dbConnection to (LoggingDbConnection) before calling .QueryAsync().
    • Outcome: Trying these fixes showed that my LoggingDbConnection.QueryAsync was then called! So the compiler needed to know it was dealing with a LoggingDbConnection at the call site.
  • Key Rule/Guideline Remembered (Initial thought): The compiler needs a specific "hint" (cast or concrete type) to choose my special version of a method if the variable is just an interface type.

Date: May 11, 2025 (Simulated Day 2)

  • Topic/Problem: Deep dive into why the compiler prefers Dapper's extension method over my instance method when the variable is typed as IDbConnection.
  • What I Learned/Discovered:
    • C# method resolution is primarily based on the compile-time (declared) type of the variable.
    • The Compiler's "Method Hunt" Process:
      1. It first checks if the declared type itself (e.g., IDbConnection) has an instance method with the called signature.
      2. If not found on the declared type, then it looks for applicable extension methods for that declared type.
    • IDbConnection itself does not define QueryAsync. So, check #1 fails for IDbConnection.
    • Dapper provides QueryAsync as an extension method for IDbConnection. So, check #2 finds Dapper's method.
  • Challenges/Confusions: The "priority" of extension methods vs. polymorphism. I thought polymorphism (runtime object type determining the method) would always take precedence.
  • Solutions/Clarifications:
    • Polymorphism applies to methods that are part of the declared type's contract (e.g., an interface method, or a virtual method being overridden).
    • Since QueryAsync isn't part of the IDbConnection contract, polymorphism doesn't directly apply to make it choose LoggingDbConnection.QueryAsync when the variable is just IDbConnection.
    • It's not that extension methods have "higher priority" than polymorphism; it's that the conditions for invoking polymorphism (for that specific method call on an IDbConnection variable) weren't met because IDbConnection doesn't define QueryAsync.
  • Key Rule/Guideline Remembered: Compiler checks Instance methods on Declared Type FIRST. If none, THEN it looks for Extension methods for Declared Type. Polymorphism applies to methods defined by the declared type's contract.

Date: May 12, 2025 (Simulated Day 3)

  • Topic/Problem: How to make my database access (with logging) more flexible and adhere to the Open/Closed Principle (OCP).
  • What I Learned/Discovered:
    • OCP: Open for extension, closed for modification.
    • My quick fixes (casting or using concrete LoggingDbConnection type in the service) make the service class itself less OCP-compliant because it now knows about specific implementations.
    • A better OCP approach: Define a new interface (e.g., IDataQueryExecutor) that explicitly includes the methods my service needs (like QueryAsync, QuerySingleOrDefaultAsync).
    • My LoggingDbConnection (or a new LoggingQueryExecutor) would implement this new interface.
    • The service would then depend on IDataQueryExecutor, not IDbConnection or LoggingDbConnection.
  • Challenges/Confusions: This OCP solution seemed very complex with new interfaces and changing dependencies. Felt overwhelming.
  • Solutions/Clarifications:
    • The OCP approach is an ideal for maintainability, especially in larger systems.
    • The core idea is to make the service depend on a "contract" (the new interface) rather than a concrete implementation, allowing different implementations (logging, caching, none) to be swapped without changing the service.
    • It's okay if the immediate fix was simpler; OCP is a goal one can refactor towards.
  • Key Rule/Guideline Remembered (for OCP context): Depend on abstractions (interfaces defining needed operations), not concretions, to allow easier extension without modifying existing client code.

This log captures the essence of the learning journey, from encountering a problem to understanding the underlying C# mechanisms and finally considering advanced design principles.

Polymorphism only applies to methods declared in the base class or interface.
The compiler only uses methods it sees in the declared type — and will never "guess" based on the actual runtime type unless you're using virtual or an interface method.
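
A minimal repro of both rules, with hypothetical IGreeter/LoudGreeter types:

using System;

interface IGreeter { }                        // declares no Speak method

static class GreeterExtensions
{
    // Extension method targeting the interface
    public static void Speak(this IGreeter g) => Console.WriteLine("extension Speak");
}

class LoudGreeter : IGreeter
{
    // Instance method on the concrete type; not part of the interface contract
    public void Speak() => Console.WriteLine("LoudGreeter.Speak");
}

class Program
{
    static void Main()
    {
        IGreeter g = new LoudGreeter();
        g.Speak();                 // "extension Speak"   (resolved against IGreeter)
        ((LoudGreeter)g).Speak();  // "LoudGreeter.Speak" (declared type is now LoudGreeter)
    }
}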

Let's tackle these excellent questions one by one!

Question 1: "so polymorphism will only apply for Virtual?"

You're very close! virtual methods are a cornerstone of polymorphism in C# (specifically, overriding in derived classes). But it's a bit broader:

Polymorphism (the ability for an object to take on many forms, and for a call to a method to execute different code depending on the runtime type of the object) primarily applies in these C# scenarios:

  1. virtual methods: When a base class declares a method as virtual, derived classes can override it. When you call that method on a base class variable that holds a derived class object, the derived class's overridden version is executed. (This is what you mentioned).
  2. abstract methods: An abstract method in an abstract class must be implemented by any concrete derived class. This is a form of polymorphism because the abstract class variable can hold different concrete derived types, each with its own implementation of that abstract method.
  3. Interface Method Implementations: This is very relevant to our IDbConnection discussion! When a class implements an interface, it provides concrete implementations for the methods defined in that interface. If you have a variable of the interface type (e.g., IDbConnection myConnection), and it holds an object of a class that implements that interface (e.g., LoggingDbConnection), calling an interface method (like myConnection.Open()) will execute the specific implementation provided by LoggingDbConnection.

The common thread: For polymorphism to work as described (where the runtime type's method is chosen), the method being called must be part of the contract defined by the declared type of the variable (the interface contract, or the base class's virtual/abstract method signature).


Question 2: "if an extension method of QueryAsync didn't exist would it then had picked up LoggingDbConnection.QueryAsync() or not?" (Assuming _dbConnection is still declared as IDbConnection).

That's another fantastic question to really solidify the method resolution rules! Let's trace the compiler's "Method Hunt" for this exact scenario:

  • Your variable: IDbConnection _dbConnection; (which at runtime holds a LoggingDbConnection instance).
  • Your call: _dbConnection.QueryAsync<T>(...)
  • Your hypothetical: Dapper's QueryAsync extension method for IDbConnection does not exist.

Compiler's Process:

  1. Step 1: Instance Method on Declared Type (IDbConnection)?

    • The compiler looks at IDbConnection.
    • Does the IDbConnection interface itself define a method QueryAsync<T> with a matching signature?
    • Answer: No. QueryAsync is not part of the standard IDbConnection interface contract.
    • Result of Step 1: No direct instance method found on the declared type.
  2. Step 2: Extension Method for Declared Type (IDbConnection)?

    • The compiler looks for accessible extension methods that target IDbConnection and are named QueryAsync<T> with a matching signature.
    • In your hypothetical scenario, you said "if an extension method of QueryAsync didn't exist."
    • Answer: No. (Because we're pretending Dapper's or any other relevant extension method isn't there).
    • Result of Step 2: No extension method found.

Conclusion for Question 2:

Since both Step 1 (instance method on IDbConnection) and Step 2 (extension method for IDbConnection) failed to find a suitable QueryAsync method, the compiler would NOT have picked up LoggingDbConnection.QueryAsync().

Instead, you would get a compile-time error. The error message would be something like: 'System.Data.IDbConnection' does not contain a definition for 'QueryAsync' and no accessible extension method 'QueryAsync' accepting a first argument of type 'System.Data.IDbConnection' could be found (are you missing a using directive or an assembly reference?)

Why? The compiler doesn't automatically "peek" inside the runtime object (LoggingDbConnection) for methods that aren't part of the declared type's (IDbConnection) contract or provided as an extension for that declared type. The method resolution is based on what's visible and contracted through the lens of the declared type at compile time.

Your LoggingDbConnection.QueryAsync() instance method is only "visible" to the compiler without a cast if the variable _dbConnection is itself declared as type LoggingDbConnection.

Friday, April 18, 2025

Principles

Measure twice, cut once.

"The difference between a smart programmer and a professional programmer is that the professional understands that clarity is king. Professionals write code that others can understand."

Tuesday, March 11, 2025

Virtualization

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Virtualized Table Demo</title>
    <style>
        body {
            font-family: Arial, sans-serif;
            margin: 20px;
        }

        h1 {
            color: #333;
        }

        .stats {
            background-color: #f4f4f4;
            padding: 10px;
            margin: 10px 0;
            border-radius: 4px;
        }

        #table-container {
            border: 1px solid #ccc;
            border-radius: 4px;
            width: 100%;
            max-width: 800px;
            box-shadow: 0 2px 5px rgba(0,0,0,0.1);
        }

        .row-content {
            display: flex;
            border-bottom: 1px solid #eee;
            background-color: #fff;
        }

        .row-content:hover {
            background-color: #f9f9f9;
        }

        .cell {
            padding: 10px;
            flex: 1;
            overflow: hidden;
            text-overflow: ellipsis;
            white-space: nowrap;
        }

        .header {
            display: flex;
            background-color: #f0f0f0;
            font-weight: bold;
            border-bottom: 2px solid #ddd;
        }

        .header .cell {
            padding: 10px;
        }

        /* Different background colors for even rows */
        .virtual-row:nth-child(even) .row-content {
            background-color: #f7f7f7;
        }

        .controls {
            margin: 20px 0;
        }

        button {
            padding: 8px 15px;
            background-color: #4CAF50;
            color: white;
            border: none;
            border-radius: 4px;
            cursor: pointer;
            margin-right: 10px;
        }

        button:hover {
            background-color: #45a049;
        }
    </style>
</head>
<body>
    <h1>Virtualized Table Demo</h1>

    <div class="stats">
        <div>Total Rows: <span id="total-rows">10,000</span></div>
        <div>DOM Elements: <span id="dom-elements">0</span></div>
        <div>Current Scroll Index: <span id="current-index">0</span></div>
    </div>

    <div class="controls">
        <button id="scroll-to-middle">Scroll to Middle</button>
        <button id="scroll-to-end">Scroll to End</button>
        <button id="scroll-to-start">Scroll to Start</button>
    </div>

    <div class="header">
        <div class="cell">ID</div>
        <div class="cell">Name</div>
        <div class="cell">Email</div>
        <div class="cell">City</div>
    </div>

    <div id="table-container"></div>

    <script>
        // Generate a large amount of mock data
        function generateMockData(count) {
            const names = ['John', 'Jane', 'Michael', 'Emily', 'David', 'Sarah', 'Robert', 'Olivia', 'William', 'Sophia'];
            const lastNames = ['Smith', 'Johnson', 'Williams', 'Brown', 'Jones', 'Miller', 'Davis', 'Garcia', 'Wilson', 'Martinez'];
            const cities = ['New York', 'Los Angeles', 'Chicago', 'Houston', 'Phoenix', 'Philadelphia', 'San Antonio', 'San Diego', 'Dallas', 'San Jose'];

            return Array.from({ length: count }, (_, i) => ({
                id: i + 1,
                name: `${names[Math.floor(Math.random() * names.length)]} ${lastNames[Math.floor(Math.random() * lastNames.length)]}`,
                email: `user${i + 1}@example.com`,
                city: cities[Math.floor(Math.random() * cities.length)]
            }));
        }

        class VirtualizedTable {
            constructor(options) {
                this.container = options.container;
                this.data = options.data || [];
                this.rowHeight = options.rowHeight || 40;
                this.visibleRows = options.visibleRows || 10;
                this.bufferRows = options.bufferRows || 5;

                this.totalRows = this.data.length;
                this.totalHeight = this.totalRows * this.rowHeight;
                this.renderedRows = [];

                this.setupContainer();
                this.renderInitialView();
                this.attachScrollHandler();

                // Update stats
                document.getElementById('total-rows').textContent = this.totalRows.toLocaleString();
            }

            setupContainer() {
                this.container.style.position = 'relative';
                this.container.style.overflow = 'auto';
                this.container.style.height = (this.visibleRows * this.rowHeight) + 'px';

                this.spacer = document.createElement('div');
                this.spacer.style.height = this.totalHeight + 'px';
                this.spacer.style.width = '100%';
                this.spacer.style.position = 'relative';
                this.container.appendChild(this.spacer);
            }

            renderInitialView() {
                const totalRowsToRender = this.visibleRows + (this.bufferRows * 2);

                for (let i = 0; i < Math.min(totalRowsToRender, this.totalRows); i++) {
                    this.createRowElement(i);
                }

                // Update DOM elements count in stats
                document.getElementById('dom-elements').textContent = this.renderedRows.length;
            }

            createRowElement(dataIndex) {
                const rowData = this.data[dataIndex];

                const rowElement = document.createElement('div');
                rowElement.className = 'virtual-row';
                rowElement.style.position = 'absolute';
                rowElement.style.top = (dataIndex * this.rowHeight) + 'px';
                rowElement.style.height = this.rowHeight + 'px';
                rowElement.style.width = '100%';

                rowElement.innerHTML = `
                    <div class="row-content">
                        <div class="cell">${rowData.id}</div>
                        <div class="cell">${rowData.name}</div>
                        <div class="cell">${rowData.email}</div>
                        <div class="cell">${rowData.city}</div>
                    </div>
                `;

                rowElement.dataset.virtualIndex = dataIndex;

                this.spacer.appendChild(rowElement);
                this.renderedRows.push({
                    element: rowElement,
                    dataIndex: dataIndex
                });

                return rowElement;
            }

            updateRowElement(rowObj, newDataIndex) {
                const { element } = rowObj;
                const rowData = this.data[newDataIndex];

                element.style.top = (newDataIndex * this.rowHeight) + 'px';

                element.innerHTML = `
                    <div class="row-content">
                        <div class="cell">${rowData.id}</div>
                        <div class="cell">${rowData.name}</div>
                        <div class="cell">${rowData.email}</div>
                        <div class="cell">${rowData.city}</div>
                    </div>
                `;

                element.dataset.virtualIndex = newDataIndex;
                rowObj.dataIndex = newDataIndex;
            }

            attachScrollHandler() {
                this.container.addEventListener('scroll', () => {
                    this.updateVisibleRows();

                    // Update current index in stats
                    const firstVisibleIndex = Math.floor(this.container.scrollTop / this.rowHeight);
                    document.getElementById('current-index').textContent = firstVisibleIndex;
                });
            }

            updateVisibleRows() {
                const scrollTop = this.container.scrollTop;

                const firstVisibleIndex = Math.floor(scrollTop / this.rowHeight);

                const startIndex = Math.max(0, firstVisibleIndex - this.bufferRows);
                const endIndex = Math.min(
                    this.totalRows - 1,
                    firstVisibleIndex + this.visibleRows + this.bufferRows
                );

                const rowsToUpdate = this.renderedRows.filter(row => {
                    return row.dataIndex < startIndex || row.dataIndex > endIndex;
                });

                const indicesToShow = [];
                for (let i = startIndex; i <= endIndex; i++) {
                    const isRendered = this.renderedRows.some(row => row.dataIndex === i);
                    if (!isRendered) {
                        indicesToShow.push(i);
                    }
                }

                for (let i = 0; i < Math.min(rowsToUpdate.length, indicesToShow.length); i++) {
                    this.updateRowElement(rowsToUpdate[i], indicesToShow[i]);
                }
            }

            scrollToIndex(index) {
                const targetIndex = Math.min(Math.max(0, index), this.totalRows - 1);
                this.container.scrollTop = targetIndex * this.rowHeight;
            }
        }

        // Initialize the table when the page loads
        document.addEventListener('DOMContentLoaded', () => {
            const container = document.getElementById('table-container');
            const mockData = generateMockData(10000); // 10,000 rows

            const virtualTable = new VirtualizedTable({
                container: container,
                data: mockData,
                rowHeight: 40,
                visibleRows: 15,
                bufferRows: 5
            });

            // Setup scroll buttons
            document.getElementById('scroll-to-middle').addEventListener('click', () => {
                virtualTable.scrollToIndex(5000);
            });

            document.getElementById('scroll-to-end').addEventListener('click', () => {
                virtualTable.scrollToIndex(9999);
            });

            document.getElementById('scroll-to-start').addEventListener('click', () => {
                virtualTable.scrollToIndex(0);
            });
        });
    </script>
</body>
</html>

Tuesday, March 4, 2025

jquery plugin

// Simple stateful toggle plugin
(function($) {
    'use strict';

    $.fn.toggleState = function(options) {
        // 'status' is a getter, so handle it before the each() loop:
        // returning from inside each() only continues the iteration, and the
        // caller would receive the jQuery object instead of the state.
        if (options === 'status') {
            return this.first().data('plugin_toggleState');
        }

        return this.each(function() {
            var $element = $(this);

            // Get the existing state from the element or initialize it
            var state = $element.data('plugin_toggleState');

            // If the plugin hasn't been initialized on this element yet
            if (!state) {
                // Initialize the state and store it using $.data().
                // The resolved settings are kept with the state so that later
                // method calls like 'toggle' reuse the original options
                // instead of falling back to the defaults.
                state = {
                    isActive: false,
                    toggleCount: 0,
                    settings: $.extend({
                        activeClass: 'active',
                        inactiveClass: 'inactive',
                        onToggle: function() {}
                    }, typeof options === 'object' ? options : {})
                };

                // Store the state object on the DOM element
                $element.data('plugin_toggleState', state);

                // Initialize the element appearance
                $element.addClass(state.settings.inactiveClass);

                // Set up click handler (namespaced for easy cleanup)
                $element.on('click.toggleState', function() {
                    $element.toggleState('toggle');
                });
            }

            var settings = state.settings;

            // Method invocation handling
            if (options === 'toggle') {
                // Toggle the state
                state.isActive = !state.isActive;
                state.toggleCount++;

                // Update the element
                if (state.isActive) {
                    $element.removeClass(settings.inactiveClass).addClass(settings.activeClass);
                } else {
                    $element.removeClass(settings.activeClass).addClass(settings.inactiveClass);
                }

                // The state object is stored by reference, so this re-store is defensive
                $element.data('plugin_toggleState', state);

                // Call the callback
                settings.onToggle.call($element, state.isActive, state.toggleCount);
            }
            else if (options === 'destroy') {
                // Clean up
                $element.removeData('plugin_toggleState');
                $element.off('.toggleState');
                $element.removeClass(settings.activeClass + ' ' + settings.inactiveClass);
            }
        });
    };
})(jQuery);


// Usage:
$(document).ready(function() {
    // Initialize the plugin
    $('.toggle-button').toggleState({
        activeClass: 'btn-success',
        inactiveClass: 'btn-secondary',
        onToggle: function(isActive, count) {
            console.log('Button toggled to: ' + (isActive ? 'active' : 'inactive'));
            console.log('Button has been toggled ' + count + ' times');
        }
    });

    // Get the state
    $('#statusButton').on('click', function() {
        var state = $('.toggle-button').first().toggleState('status');
        alert('First button state: ' + (state.isActive ? 'active' : 'inactive') +
              '\nToggle count: ' + state.toggleCount);
    });

    // Programmatically toggle
    $('#toggleAllButton').on('click', function() {
        $('.toggle-button').toggleState('toggle');
    });

    // Destroy the plugin
    $('#resetButton').on('click', function() {
        $('.toggle-button').toggleState('destroy');
    });
});

Monday, February 24, 2025

Generics example

using NLog;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Mvc;

namespace ICS360.Models
{
    public class PaginationHelper
    {
        private readonly ILogger _logger;

        public PaginationHelper(ILogger logger)
        {
            _logger = logger;
        }

        public JsonResult GetFilteredData<T>(
            List<T> sourceData,
            int take,
            int skip,
            string filter,
            Func<T, string> filterProperty,
            string[] selectedIds,
            Func<T, string> idProperty) where T : IBaseFilterResult
        {
            try
            {
                var filteredData = sourceData;

                // Apply filter search text if provided
                if (!string.IsNullOrEmpty(filter))
                {
                    filteredData = sourceData
                        .Where(item => filterProperty(item)
                            .IndexOf(filter, StringComparison.OrdinalIgnoreCase) >= 0)
                        .ToList();
                }

                var total = filteredData.Count;

                // Sort with selected items first, then by display name
                var orderedData = filteredData
                    .OrderByDescending(item => selectedIds != null &&
                                              selectedIds.Contains(idProperty(item)))
                    .ThenBy(item => filterProperty(item))
                    .ToList();

                var pagedData = orderedData
                    .Skip(skip)
                    .Take(take)
                    .Select(item =>
                    {
                        item.IsSelected = selectedIds != null && selectedIds.Contains(idProperty(item));
                        return item;
                    })
                    .ToList();

                return new JsonResult
                {
                    Data = new { Data = pagedData, Total = total },
                    JsonRequestBehavior = JsonRequestBehavior.AllowGet
                };
            }
            catch (Exception ex)
            {
                _logger.Error(ex);
                return new JsonResult
                {
                    Data = new { Data = new List<T>(), Total = 0 },
                    JsonRequestBehavior = JsonRequestBehavior.AllowGet
                };
            }
        }

        public JsonResult GetIndicesByIds<T, TId>(
            List<T> sourceData,
            List<TId> ids,
            Func<T, TId> idProperty) where TId : IEquatable<TId>
        {
            if (ids == null || !ids.Any())
            {
                return new JsonResult
                {
                    Data = new List<int>(),
                    JsonRequestBehavior = JsonRequestBehavior.AllowGet
                };
            }

            try
            {
                var indices = ids
                    .Select(id => sourceData.FindIndex(item => idProperty(item).Equals(id)))
                    .Where(index => index >= 0)
                    .ToList();

                return new JsonResult
                {
                    Data = indices,
                    JsonRequestBehavior = JsonRequestBehavior.AllowGet
                };
            }
            catch (Exception ex)
            {
                _logger.Error(ex);
                return new JsonResult
                {
                    Data = new List<int>(),
                    JsonRequestBehavior = JsonRequestBehavior.AllowGet
                };
            }
        }
    }
}
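
A hypothetical usage sketch. It assumes IBaseFilterResult (not shown above) exposes the IsSelected property the helper sets, and that a model like CustomerFilterResult implements it; _customers is a placeholder for the data source:

// Assumed model implementing IBaseFilterResult
public class CustomerFilterResult : IBaseFilterResult
{
    public string Id { get; set; }
    public string DisplayName { get; set; }
    public bool IsSelected { get; set; }
}

// Inside an MVC controller action:
public JsonResult FilterCustomers(int take, int skip, string filter, string[] selectedIds)
{
    var helper = new PaginationHelper(LogManager.GetCurrentClassLogger());
    return helper.GetFilteredData(
        _customers,          // List<CustomerFilterResult>
        take, skip, filter,
        c => c.DisplayName,  // text the filter is matched against
        selectedIds,
        c => c.Id);          // key compared with selectedIds
}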