During a recent project, we ran into issues with notifications not working. Outlook was sending notifications, but the source program wasn’t listed as Outlook; it appeared as Microsoft.Office.OUTLOOK.EXE.15. In the Notifications & Actions settings, it showed up as “Microsoft Outlook” without an icon and was turned Off; turning it On wasn’t possible, as it immediately switched back to Off.
As shown in the screenshot below:
After hours of troubleshooting, I concluded that everything worked when I first signed in without any policies applied and then signed in again with the policies in place. That seemed strange to me, because I assumed that a policy, once applied, would change the relevant setting to the same value it would have had if I had started with all policies applied.
I had to find out which policy setting was causing this issue. After hours of further troubleshooting, signing in, signing out, removing the profile, and repeating, I finally discovered that the culprit was the policy setting “Remove common program groups from the Start Menu”: User Configuration > Administrative Templates > Start Menu and Taskbar > Remove common program groups from the Start Menu.
Solution
As we managed the Start Menu using Citrix WEM to keep it as clean as possible, we needed a workaround to fix this. The solution was to use FSLogix App Masking to hide all non-default icons. I used the article from James Kindon (thanks for the tips) to create a clean Start Menu. After making the changes, users must sign in two or three times before everything works correctly. Below, you see how it is supposed to work.
Screenshots: correct Outlook notification and working Notifications & Actions settings.
As mentioned before, I used James Kindon’s tips to create the App Masking rules. Below, you will see a subset of all the hiding rules and which assignments to use.
This environment is based on Windows Server 2022. I didn’t test whether this issue applies to Server 2019 or Windows 10/11. If you have the same issue with another Operating System, please comment so other users know it applies to them.
During a recent project, I encountered an issue where the Snipping Tool would freeze upon clicking the options button. This occurred in a fully patched Windows Server 2022 environment (patched up to and including KB5035857).
After troubleshooting the machine, I found no leads to direct me toward a solution. I then checked all my Server 2022 machines, including those without Citrix or RDSH, and discovered they all experienced the same issue, unlike the Server 2019 and 2016 machines.
I noticed a significant difference in the build numbers of SnippingTool.exe between the 2019 and 2022 versions. Transferring the 2019 version to the 2022 server resolved the problem.
To ensure I used a supported version of the Snipping Tool for Windows Server 2022, I began with a freshly installed server, tested the tool, and then sequentially installed all Microsoft updates. The Snipping Tool malfunctioned after the August 2023 update. Reviewing the versions released between August 2023 and March 2024 revealed updates to the Snipping Tool (as indicated by the build numbers), but the only change noticeable to users was a notification regarding the tool’s status (10.0.20348.2227).
For now, I am using the Snipping Tool version from before the August 2023 update (build 10.0.20348.1726).
In a previous article about using Entra ID B2B users, I needed to set up some user extensions. In that article, I also mentioned that many customers needed access to the environment, which meant that multiple users had to be modified. As this is an ongoing business and we like to automate these kinds of tasks, I looked for an automation option. Creating a script wasn’t the problem; the hard part was connecting to Entra ID without entering credentials every time or storing them in the script.
After some time, I stumbled upon Azure Automation, where you can create runbooks (scripts) and add them to a scheduled task.
What is Azure Automation
Azure Automation is a cloud-based service provided by Microsoft Azure that enables users to automate various manual, time-consuming, and repetitive tasks across Azure and on-premises environments. It offers tools for process automation, configuration management, and update management. Using PowerShell or Python to automate complex tasks, users can create, schedule, and manage runbooks (workflow scripts). Azure Automation also provides Desired State Configuration (DSC) to define and enforce system configurations. Additionally, it integrates with other Azure services, allowing users to orchestrate end-to-end workflows for efficient and consistent management of resources in the Azure cloud.
Creating an Azure Automation account
We must create an Azure Automation account before we can use Azure Automation and deploy our script. Sign in to the Azure portal and search for Azure Automation. Click “Create” and follow the wizard; below you can see the output of the “Review + Create” step.
PowerShell Modules
When using Azure Automation, verifying the availability of the intended PowerShell modules is crucial. For this situation, the following modules are essential:
Microsoft.Graph.Authentication: Responsible for establishing connections (Connect-MgGraph) and severing connections (Disconnect-MgGraph) to MgGraph.
Microsoft.Graph.Groups: Required for retrieving group members (Get-MgGroupMember) and obtaining information based on a group ID (Get-MgGroup).
Microsoft.Graph.Identity.SignIns: Necessary for initiating invitations (New-MgInvitation) through identity sign-ins.
Microsoft.Graph.Users: Utilized for fetching user information (Get-MgUser) and facilitating user updates (Update-MgUser).
To add the modules to the Azure Automation account, open the account and go to “Modules,” then follow the below steps to add the module. Repeat these steps until all required modules are added:
Click on “Add a Module”
Select “Browse from gallery”
Click on “Click here to browse from gallery”
Search for the module and select it
Click “Select” at the bottom of the page
Now set the “Runtime Version”. In our case, it’s 5.1.
Click “Import”
To verify whether all required PowerShell modules are available or still being imported, click “Modules” within the Azure Automation account.
After adding the appropriate PowerShell modules, we need to add the required permissions.
Permissions
To get our script working with the appropriate permissions, we need the following permissions:
User.ReadWrite.All
User.Invite.All
Directory.Read.All
As we use a Managed Identity, we need to add the permissions using Graph Explorer. First, we must connect to the Graph Explorer: https://aka.ms/ge and sign in using an account in the correct tenant. You can’t switch tenant/directory when using Graph Explorer.
Step 1: Finding the Service Principal
We first need to find the service principal for our managed identity (automation account). To do so, we need to perform the following query, where displayname equals our Managed Identity name, in our case, “CTX-DaaS”:
HTTP
GET https://graph.microsoft.com/v1.0/servicePrincipals?$search="displayName:CTX-Daas"
Write down the ID. We need this in step 4.
Step 2: Finding the Microsoft Graph Service principal ID
As the Microsoft Graph API always has the following AppId “00000003-0000-0000-c000-000000000000”, we can create a filter and only look for this AppId. We can use the following query to get the ID:
HTTP
GET https://graph.microsoft.com/v1.0/servicePrincipals?$filter=appId eq '00000003-0000-0000-c000-000000000000'
Write down the ID. We need this in steps 3 and 4.
Step 3: Finding the Application Roles (Permissions) id
If you would like to use Graph Explorer for this step, that is also possible; follow the step below. As we can’t add a filter to this query, we have to do it the old-fashioned way and use CTRL+F to search for the appropriate IDs. First, run the query below, replacing {ServicePrincipalID} with the ID we found in step 2.
HTTP
GET https://graph.microsoft.com/v1.0/servicePrincipals/{ServicePrincipalID}/approles
After running the query, we get a list of all the Application Roles. Luckily, the output contains only around 100 results. Now, use CTRL+F to search for the permissions mentioned above and write down their IDs.
Step 4: Assigning the Application Roles to the Managed Identity
To assign the Application Roles, we need the following IDs:
PrincipalId (step 1)
ResourceId (step 2)
AppRoleId (step 3)
To add the Application Roles to the Managed Identity, we need to use a POST query in combination with a Content-Type application/json in the header and the following request body:
HTTP
POST https://graph.microsoft.com/v1.0/servicePrincipals/{your-graph-serviceprincipal-id}/appRoleAssignedTo
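The request body itself isn’t reproduced above; based on the Microsoft Graph appRoleAssignedTo reference, it contains the three IDs collected in the previous steps. A minimal sketch (the placeholders are the IDs you wrote down, not literal values):
JSON
{
  "principalId": "<ID from step 1>",
  "resourceId": "<ID from step 2>",
  "appRoleId": "<ID from step 3>"
}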
Repeat step 4 until you have assigned all the Application Roles.
Step 5: Verifying the permissions in Entra ID
Now that we have set the Application Roles (permissions), we can verify this within the Entra ID portal. To do so, go to Entra ID > Enterprise Applications and set the Application Type filter to “Managed Identities”.
Now, select the Managed Identity we created to perform the Azure Automation tasks and go to Permissions. Here, you will see the permissions we just assigned.
Creating the runbook
Now that all the appropriate permissions are in place, we can start creating the runbook. For this case, we will look for users who are members of a security group called “CTX-DaaS-B2B” and verify whether the Extension (created in the previous blog) is present and whether the CreationType equals Invitation. We check this for each member of the security group and apply changes where needed.
Below, you can see the script we are using to automate the process of invitations and setting the extensions so the user can sign in to the Citrix Environment, which is mentioned in the previous blog.
PowerShell
Connect-MgGraph -Identity -NoWelcome

$Group = "CTX-DaaS-B2B"
$group = Get-MgGroup -Filter "DisplayName eq '$Group'"
$members = Get-MgGroupMember -GroupId $group.Id
$ExtensionUPN = "extension_c8ef4d2cb6694d76974c5a05be13affa_guestShadowUPN"
$ExtensionSID = "extension_c8ef4d2cb6694d76974c5a05be13affa_guestUserOnPremSID"
$CreationType = "Invitation"
$RedirectURL = "https://newyard.Cloud.com"
$UserMessage = "You have been invited to the Citrix New Yard environment. Redeem this invitation to get started and you'll be redirected to the Citrix New Yard"

foreach ($member in $members) {
    $user = Get-MgUser -UserId $member.Id -Property ID, DisplayName, Mail, UserPrincipalName, UserType, CreationType, ExternalUserState, OnPremisesSecurityIdentifier

    if ($user.UserType -ne "Guest") {
        Write-Output "User $($user.DisplayName) is Member, changing to Guest!"
        Update-MgUser -UserId $user.Id -UserType "Guest"
    }

    $UserExtensions = (Get-MgUser -UserId $user.Id -Property $ExtensionUPN).AdditionalProperties
    if ($UserExtensions.Keys -NotContains $ExtensionUPN) {
        Write-Output "User $($user.DisplayName) has no Extension, updating UPN and SID Extensions"
        Update-MgUser -UserId $user.Id -AdditionalProperties @{$ExtensionUPN = $user.UserPrincipalName}
        Update-MgUser -UserId $user.Id -AdditionalProperties @{$ExtensionSID = $user.OnPremisesSecurityIdentifier}
    }

    Write-Output "$($user.DisplayName) has CreationType $($user.CreationType)"
    if ($user.CreationType -ne $CreationType) {
        Write-Output "User $($user.DisplayName) isn't invited, inviting user!"
        New-MgInvitation -InvitedUser $user -InvitedUserEmailAddress $user.Mail -InviteRedirectUrl $RedirectURL -SendInvitationMessage:$true -InvitedUserMessageInfo:@{CustomizedMessageBody = $UserMessage}
    }
}

Disconnect-MgGraph
Once we had tested the runbook (using the Test pane in the PowerShell runbook) and verified everything was correct, we published it to production. This can be done using the Publish button inside the PowerShell runbook.
Create a Schedule
Now that the runbook is published, we can schedule it. As we are starting to migrate users to the new environment, we have set the schedule to run every hour. When the migration is done, we can change the schedule to once a day, as we normally don’t make that many changes or create multiple users per day.
Let’s start with creating the schedule. We go to the Schedules from the Automation account and add a schedule.
Once the schedule is created, we need to link the runbook to it. Go to the runbook, click the “Link to schedule” button, select the correct schedule, and click OK. To verify that the runbook is successfully linked, click “Schedules” within the runbook and check that the correct schedule is listed.
Verify the Azure Automation Task
Now that we have successfully completed all the steps:
Creating the Azure Automation account
Adding the PowerShell modules
Setting the permissions
Creating the runbook
Creating and linking a schedule
We can now see in the overview of the Azure Automation account if the runs are completed successfully. As you can see in the screenshot below, the job has run once and was successful. The users will receive an invite every hour unless they already received one.
Conclusion
We can conclude that Azure Automation is a powerful tool for automating repetitive tasks, such as sending invites to new guests. I couldn’t have achieved this without the guidance of GoToGuy’s blog, which explains how to use Graph Explorer to configure all the necessary permissions. I will undoubtedly use Azure Automation more frequently now that I know its capabilities.
In a recent project, I designed an environment for a client who offers a published application to a substantial customer base. They expressed their desire to incorporate the use of Entra ID. I recommended the adoption of Entra ID B2B accounts for this purpose. Entra ID B2B accounts enable access to external partners for applications and resources within the organizational environment.
About Entra ID B2B
Entra ID B2B, short for Business-to-Business, is a Microsoft cloud service that facilitates secure collaboration among organizations. It permits organizations to extend invitations to external entities, such as partners, suppliers, or customers, granting them access to resources hosted within their Entra ID environment (formerly Azure AD). This approach eliminates the necessity for external users to establish new accounts, streamlining the collaborative process. Entra ID B2B bolsters security by implementing conditional access policies and multi-factor authentication for guest users, ensuring that only authorized individuals can access shared resources. Read here for more about Entra ID B2B.
Issues during implementation
While implementing this solution within the Citrix Cloud using the default Entra ID connect option, I encountered issues during the sign-in process with B2B accounts. After asking around, Julian Jakob directed me to an article highlighting an issue related to the default Entra ID connection in conjunction with Citrix FAS (Federated Authentication Service).
Upon reaching out to Citrix for support, it was revealed that there had been a modification to their SAML 2.0 configuration. This change now offers the option to utilize UPN (User Principal Name) or SID (Security Identifier) for authentication, whereas both were previously required. Please refer to the screenshot below for visual reference to this configuration option.
Based on the information provided, we successfully implemented Entra ID B2B accounts. However, this implementation necessitated certain modifications to our On-Premises Active Directory. Specifically, to grant B2B access to individual users, updating their User Principal Name (UPN) to align with their respective email addresses was imperative. While this approach was effective, it may not be ideal, especially considering the potential complexities of managing multiple DNS suffixes within the environment. As such, I dedicated some time to exploring alternative solutions that would obviate the need for such changes within the Active Directory.
Earlier in the project, I had come across an article from Citrix that introduced the concept of a ClaimsMappingPolicy. This approach leverages Azure AD Preview modules and a SAML Tech preview. It is worth noting that I refrained from testing this solution previously due to its preview status. Please refer to the provided article link for detailed information on this alternative approach.
Configuring the Entra ID as IdP
To initiate the configuration of Entra ID as the Identity Provider (IdP), the following key steps must be taken:
Enterprise Application Setup: The first step involves the configuration of the Enterprise Application within the designated environment.
Microsoft Graph Integration: Integrating with Microsoft Graph services is essential to facilitate the required functionalities.
Claims Policy Mapping: Mapping of claims policies is necessary to define and manage the assertion of identity attributes.
SAML 2.0 Connection in Citrix Cloud: Establishing a SAML 2.0 connection within the Citrix Cloud infrastructure enables secure authentication and authorization processes.
Enterprise Application Setup
To set up and configure a custom Entra ID Enterprise SAML application, follow the steps provided below. I followed the guide created by Citrix.
Azure Portal Sign-In:
Sign in to the Azure portal.
Navigate to Entra ID:
In the portal menu, select “Entra ID.”
Access Enterprise Applications:
From the left pane, under “Manage,” select “Enterprise Applications.”
Create New Application:
In the working pane’s command bar, select “New Application.”
Choose Custom Application:
Select “Create your own application” from the command bar. Avoid using the Citrix Cloud SAML SSO enterprise application template, as it restricts modifications to the list of claims and SAML attributes.
Application Name and Integration:
Enter a name for the application.
Select “Integrate any other application you don’t find in the gallery (Non-gallery).”
Click “Create.” This action will lead you to the application overview page.
Configure Single Sign-On (SSO):
From the left pane, select “Single sign-on.”
In the working pane, choose “SAML.”
Basic SAML Configuration:
In the “Basic SAML Configuration” section, select “Edit” and configure the following settings:
In the “Identifier (Entity ID)” section, select “Add identifier” and input the value associated with the region in which your Citrix Cloud tenant is located:
For European Union, United States, and Asia-Pacific South regions, enter: https://saml.cloud.com.
In the “Reply URL (Assertion Consumer Service URL)” section, select “Add reply URL” and enter the value associated with your Citrix Cloud tenant’s region.
After configuring these settings, select “Save” from the command bar.
User and Group Assignment:
From the left pane, select “Users and Groups.”
In the top bar, select “Add user/group.”
Under “Users,” select “None Selected.”
Choose the user you wish to test with, and click “Select.”
Click “Assign.” This action allows the designated user to be used for testing purposes.
These steps enable you to create and configure a custom Entra ID Enterprise SAML application for your specific requirements.
Microsoft Graph Integration
The Citrix article I came across relies on Azure AD PowerShell modules, which have been planned for deprecation. To conduct effective testing, I needed to use something that was officially supported to ensure reliable functionality. Consequently, I embarked on a transition journey, migrating all the Azure AD commands to Microsoft Graph. Fortunately, I stumbled upon an informative article outlining the Microsoft Graph commands, which are direct replacements for the deprecated Azure AD Commands. This migration strategy aligns with industry best practices and guarantees continued operational effectiveness by leveraging the Microsoft Graph framework while circumventing the challenges posed by deprecated Azure AD PowerShell modules.
Modules
After migrating all the Azure AD commands to Microsoft Graph, I had a clear overview of the required modules. Below, you’ll find a list of all the essential commands, together with their corresponding modules and the permissions required to execute them:
Command | Module | Permission
Get-MgUser | Microsoft.Graph.Users | User.Read.All
Update-MgUser | Microsoft.Graph.Users | User.ReadWrite.All
Get-MgApplication | Microsoft.Graph.Identity.SignIns | Application.Read.All
Get-MgApplicationExtensionProperty | Microsoft.Graph.Applications | Application.Read.All
Get-MgServicePrincipal | Microsoft.Graph.Applications | Application.Read.All
New-MgApplicationExtensionProperty | Microsoft.Graph.Applications | Application.ReadWrite.All
New-MgServicePrincipalClaimMappingPolicyByRef | Microsoft.Graph.Applications | Application.ReadWrite.All
New-MgPolicyClaimMappingPolicy | Microsoft.Graph.Identity.SignIns | Policy.ReadWrite.ApplicationConfiguration
Get-MgPolicyClaimMappingPolicy | Microsoft.Graph.Identity.SignIns | Policy.Read.All
To start working with Microsoft Graph, the first step is installing the necessary modules. To do this, open an elevated PowerShell session and execute the following commands:
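The original install commands aren’t reproduced here; a minimal sketch, assuming the modules from the table above are installed from the PowerShell Gallery:
PowerShell
# Install the Microsoft Graph sub-modules used in this article
# (Microsoft.Graph.Authentication is pulled in automatically as a dependency)
Install-Module Microsoft.Graph.Users
Install-Module Microsoft.Graph.Applications
Install-Module Microsoft.Graph.Identity.SignIns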
You can append -Force to each command if you prefer to streamline the installation process without individual prompts.
Permissions
When establishing a connection to Microsoft Graph, it is essential to define the scope, which dictates the permissions granted for that session. As illustrated in the table above, a comprehensive set of permissions is required, which can be summarized as follows:
Policy.Read.All
Application.ReadWrite.All
Policy.ReadWrite.ApplicationConfiguration
User.ReadWrite.All
With these permissions delineated, we can proceed to connect to Microsoft Graph. To initiate this process, follow these steps:
Open an Elevated PowerShell window.
Connect to Microsoft Graph using the following command:
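The connection command isn’t shown above; a minimal sketch, requesting the delegated permissions summarized earlier:
PowerShell
# Connect to Microsoft Graph and request the scopes listed above
Connect-MgGraph -Scopes "Policy.Read.All", "Application.ReadWrite.All", "Policy.ReadWrite.ApplicationConfiguration", "User.ReadWrite.All"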
A popup window will appear upon executing the command, prompting you to authenticate and grant Microsoft Graph the specified permissions, as detailed in the command above.
This authentication step ensures the session has the necessary privileges to interact with Microsoft Graph effectively.
Configuring Claims Policy Mapping
Now that we are ready to configure the necessary settings for the Claims Mapping Policy, we continue from the PowerShell window we previously used to connect to Microsoft Graph.
Claims Mapping Policies are used in identity management to map and transform attributes (claims) exchanged between identity providers and service providers during authentication. They enable customization of how user attributes are processed, ensuring proper handling and alignment of user data between systems for secure access control.
Preparing the Application
To utilize the Claims Mapping Policy for Entra ID B2B users, where the User Principal Name (UPN) and Security Identifier (SID) do not align with the On-Premises Active Directory, we need to extend Entra ID with specific Extensions and bind them to the application. In this context, we will need the name of the previously created application, which, for this demonstration, is referred to as “CTX-DaaS-B2B.”
Retrieve the application using the following command:
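The command itself isn’t reproduced above; a minimal sketch, assuming the application created earlier is named CTX-DaaS-B2B as in this demonstration:
PowerShell
# Retrieve the application object behind the Enterprise Application by display name
$App = Get-MgApplication -Filter "DisplayName eq 'CTX-DaaS-B2B'"

# List the directory extension properties registered on the application
Get-MgApplicationExtensionProperty -ApplicationId $App.Id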
Add the Claims Mapping Policy by providing the policy in JSON format. Before doing so, replace the ExtensionIDs with the Extension names obtained in step 3. Take note of the following names:
This process ensures the application has the necessary Extensions and Claims Mapping Policy to handle Entra ID B2B user scenarios where UPNs and SIDs do not correspond to the On-Premises Active Directory.
Obtain the Service Principal: Retrieve the Service Principal with the following command, where DisplayName is the name of the app, in this case CTX-DaaS-B2B:
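The command isn’t reproduced above; a minimal sketch:
PowerShell
# Retrieve the Service Principal of the Enterprise Application by display name
$ServicePrincipal = Get-MgServicePrincipal -Filter "DisplayName eq 'CTX-DaaS-B2B'"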
Create the Claims Mapping Policy: Now, create the Claims Mapping Policy using the $Params variable, which is the variable name assigned to the policy when pasting the JSON code. Execute the following command to establish the policy:
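Neither the JSON nor the command is reproduced above, so here is a minimal sketch. The policy name matches the “CitrixCustomClaims” name used in the next step, but the ClaimsSchema entries (which extension attributes map to which SAML claim types) are placeholders based on the general ClaimsMappingPolicy format; take the exact definition from the Citrix article:
PowerShell
# Skeleton Claims Mapping Policy definition; replace the ExtensionIDs and SAML claim
# types with the values from the Citrix article and the extensions created earlier.
$Definition = @'
{
  "ClaimsMappingPolicy": {
    "Version": 1,
    "IncludeBasicClaimSet": "true",
    "ClaimsSchema": [
      { "Source": "user", "ExtensionID": "<extension name for the shadow UPN>", "SamlClaimType": "<claim type for the UPN>" },
      { "Source": "user", "ExtensionID": "<extension name for the on-prem SID>", "SamlClaimType": "<claim type for the SID>" }
    ]
  }
}
'@

$Params = @{
    Definition            = @($Definition)
    DisplayName           = "CitrixCustomClaims"
    IsOrganizationDefault = $false
}

# Create the Claims Mapping Policy from the JSON definition above
New-MgPolicyClaimMappingPolicy -BodyParameter $Params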
Map the Claims Policy to the Service Principal: A Graph body parameter must be created using the OData protocol to map the Claims Policy to the Service Principal. Follow these steps:
Execute the command below, replacing “CitrixCustomClaims” with the name you assigned to your Claims Policy in the JSON definition (in this case, it’s “CitrixCustomClaims”):
PowerShell
Get-MgPolicyClaimMappingPolicy -Filter "Displayname eq 'CitrixCustomClaims'" | Select-Object DisplayName, Description, Id
Retrieve the ID and modify the following URL, replacing the ID at the end with the one obtained in the previous step:
https://graph.microsoft.com/beta/policies/claimsMappingPolicies/<ID from previous step>
Create a variable in PowerShell with the modified URL, like so:
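The variable and the mapping command aren’t shown above; a minimal sketch, assuming $ServicePrincipal from the earlier retrieval and the policy ID from the previous step:
PowerShell
# OData reference pointing at the Claims Mapping Policy created above
$Body = @{ "@odata.id" = "https://graph.microsoft.com/beta/policies/claimsMappingPolicies/<ID from previous step>" }

# Bind the Claims Mapping Policy to the Service Principal of the Enterprise Application
New-MgServicePrincipalClaimMappingPolicyByRef -ServicePrincipalId $ServicePrincipal.Id -BodyParameter $Body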
These steps facilitate the establishment of the Claims Mapping Policy and its connection to the Service Principal, ensuring proper configuration for your desired scenario.
Configuring the User
To configure the test user, follow the steps outlined below:
Retrieve Current UPN and SID: Execute the following command to obtain the current User Principal Name (UPN) and Security Identifier (SID). Replace “[email protected]” with the Entra ID UPN you are configuring (for this test, it’s “[email protected]“):
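The command isn’t shown above; a minimal sketch that reads the properties used later (the placeholder is the test user’s Entra ID UPN):
PowerShell
# Retrieve the current UPN and on-premises SID of the test user
$User = Get-MgUser -UserId "<Entra ID UPN of the test user>" -Property Id, UserPrincipalName, OnPremisesSecurityIdentifier
$User | Select-Object UserPrincipalName, OnPremisesSecurityIdentifier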
Update User Extensions: Now, proceed to update the user’s Extension with the appropriate values. Replace the Extension name with the one you acquired earlier and execute the following commands:
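The update commands aren’t shown above; a minimal sketch, using placeholders for the extension names you obtained earlier and for the on-premises values:
PowerShell
# Extension attribute names obtained earlier (replace with the names from your application)
$ExtensionUPN = "<extension name for the shadow UPN>"
$ExtensionSID = "<extension name for the on-prem SID>"

# Write the on-premises UPN and SID into the user's directory extensions
Update-MgUser -UserId $User.Id -AdditionalProperties @{ $ExtensionUPN = "<on-premises UPN>" }
Update-MgUser -UserId $User.Id -AdditionalProperties @{ $ExtensionSID = "<on-premises SID>" }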
These steps ensure the test user’s proper configuration, including updating relevant Extensions with the required values, effectively facilitating the Entra ID B2B user scenario.
Configuring the SAML 2.0 Connection in Citrix Cloud
With Entra ID prepared, let’s proceed to configure Citrix Cloud. To initiate this process, follow these steps:
Sign in to the Citrix Cloud administration console and navigate to the Single Sign-On settings of the previously created Enterprise App.
Access the Citrix Cloud Console, and under “Identity and Access Management,” click on “Connect” within the SAML 2.0 option.
Create a Custom Administrator URL.
In the SAML configuration, populate the following settings using the Single Sign-On URLs obtained from Entra ID within the Enterprise App:
Identity Provider Entity ID = Microsoft Entra ID Identifier.
SSO Service URL = Login URL.
Logout URL (optional) = Logout URL.
Refer to the screenshot below for reference. On the “Set up single sign-on with SAML” page, in the SAML Signing Certificate section, locate “Certificate (PEM)” and select “Download” to save the certificate to your computer. Upload this certificate to the X.509 section of the SAML configuration within Citrix Cloud.
Proceed by clicking “Test and Finish.”
Now that SAML 2.0 has been configured, it is necessary to update the Workspace Configuration to employ the newly created authentication method:
Within the Citrix Cloud Administration Console, navigate to “Workspace Configuration.”
Proceed to the “Authentication” tab.
Select “SAML 2.0” and confirm your selection by clicking “OK” in the pop-up dialog. Allow a brief moment (approximately 1 minute) before proceeding, as immediate functionality may not always be observed.
By following these steps, you configure the SAML 2.0 connection in Citrix Cloud, enabling seamless integration with Entra ID for authentication and access management.
Testing
Now that we have configured all required settings, we can start with the test.
Visit the Workspace URL and sign in using the test user employed previously:
After launching the desktop, open a command prompt or PowerShell and run the “whoami” command. You will observe that the sign-in is performed using the local Active Directory (AD) account.
Conclusion
After an extensive search and engagement with Citrix support, we arrived at a working Entra ID B2B scenario. While the initial solution relied on outdated information, it has been updated with current PowerShell commands, resulting in a functional setup. Consequently, organizations can seamlessly integrate Entra ID B2B and regular Entra ID users into their Citrix Cloud deployments, enhancing their access and identity management capabilities.
Please feel free to comment or ask questions regarding this solution. I am working on another article to guide you on using Azure Automation to update all B2B users with the new extension. So, stay tuned. UPDATE: As mentioned earlier, I would create a blog post about using Azure Automation to automate this. You can read it here.
As I deploy more and more Citrix FAS servers for customers who intend to use Entra ID (formerly Azure AD) as their Identity Provider (IdP), I have observed that the FAS Authorization Certificate requires periodic renewal. To ensure that expired certificates do not inconvenience my customers, and to maintain a level of automation and security, I have developed a simple script that notifies them by email when a new FAS Authorization Certificate is needed so they can issue it through their Active Directory Certificate Services (ADCS).
The current FAS Renewal script is designed to run locally on the FAS Servers. In the future, I plan to enhance it to allow for remote execution from a management server or similar setup. However, for the current version, you must follow these steps on all your FAS servers individually.
Automation and Security Considerations
In today’s technology landscape, automation is fundamental to organizational efficiency. However, maintaining a degree of control over critical components like your Certificate Authority is equally important. To strike a balance, I have designed this script to handle the certificate renewal process up to a certain point and then notify you via email, granting you the authority to issue the certificate from your ADCS.
Script Functionality
When I set out to create this script, my primary objective was to automate the entire certificate renewal process. The script follows a logical sequence of actions:
Checking the Expiry Date: The script evaluates the expiration date of the current FAS Authorization Certificate.
Requesting a New Certificate: If the current date approaches the expiration date, the script initiates a request for a new certificate.
Connecting to ADCS: The script establishes a connection to your Active Directory Certificate Services (ADCS), ensuring that the certificate issuance process is orchestrated securely.
Updating FAS Configuration: Once the new certificate is obtained, the script updates the Citrix FAS configuration to utilize the freshly issued certificate seamlessly.
Enhanced Security
During my research on certificate issuance using scripts, I came across discussions highlighting the importance of maintaining control over such processes. I agree with this perspective. The FAS Authorization Certificate is a Certificate Request Agent, which implies that it can potentially be used to request certificates for all users within your environment. In the event of compromise, this could pose a significant security risk, allowing unauthorized access to your organization’s resources. To mitigate this risk, I made a deliberate choice to implement a user-based approach. This ensures that the issuance of certificates is controlled and limited to authorized personnel, reducing the potential for misuse and bolstering the security of your organization’s infrastructure.
Prepare Mail User
In order for the script to successfully send emails to a predefined contact, it is crucial to ensure that the sent emails are delivered without being flagged as spam. To achieve this, authentication against the mail server is necessary. When sending unauthenticated emails through Office 365, there is a high likelihood that such emails will be routed to the recipient’s spam folder, resulting in them going unnoticed. While I have utilized my personal Office 365 account for this demonstration, it is possible to configure the script to work with an on-premises Exchange server as well. However, there are certain prerequisites that must be met to ensure successful user authentication for email transmission.
Account Setup
To establish a connection to a user’s mailbox, it is essential that the user account be enabled. Typically, shared mailboxes are configured as disabled users, but an enabled user account is required for this script to function effectively. Given that most organizations have password expiration policies, it is essential to use an Application Password to prevent script failures due to password changes. To create an Application Password, please refer to the provided link and ensure that you assign a meaningful name to it for identification purposes; in this example, “FASRenewalMail” was used.
Additionally, it is crucial to verify whether the user is permitted to connect to “Authenticated SMTP” by following the instructions provided in the linked resource.
Mail Configuration and Testing
Once the user meets all the specified prerequisites, we can proceed with configuring the mail functionality on the FAS servers. To ensure the best security and prevent passwords from being stored directly within the script, especially since scheduled tasks can run under different user accounts, we adopt a secure approach by creating an encrypted text file. This encrypted text file is the most secure method for password storage as it can only be accessed by the user who created it and exclusively on the machine where it was generated. Copying the file to another location or machine does not grant access to its contents.
To create the encrypted text file, please utilize the provided code snippet:
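The snippet isn’t reproduced here; a minimal sketch that prompts for the application password and stores it encrypted (readable only by this user on this machine), using C:\Scripts\cred.txt as in the renewal script below:
PowerShell
# Prompt for the mail account's application password and store it encrypted
Read-Host "Enter the application password" -AsSecureString |
    ConvertFrom-SecureString |
    Out-File "C:\Scripts\cred.txt"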
The encrypted text file has been saved to the directory C:\Scripts. If you prefer a different location, you have the flexibility to make that adjustment. However, it’s important to note that if the scheduled task is created under a different user account, you must initiate a PowerShell session under that specific user. Failure to do so will result in the script’s inability to retrieve the password, leading to script execution failure.
Once the encrypted text file has been successfully created, you can verify its functionality by conducting a test email transmission. Use the provided code snippet below for sending a test email, and remember to replace the $TestUser variable with the designated user you are using for this test:
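The test snippet isn’t reproduced here; a minimal sketch that loads the encrypted password, builds the credential, and sends a test mail through Office 365 ($TestUser is the mail-enabled account; the address shown is a placeholder):
PowerShell
# Mail-enabled account used for authenticated SMTP (placeholder address, replace it)
$TestUser = "fas-renewal@example.com"

# Build the credential from the encrypted password file created earlier
$SecurePassword = Get-Content "C:\Scripts\cred.txt" | ConvertTo-SecureString
$Credential = New-Object System.Management.Automation.PSCredential ($TestUser, $SecurePassword)

# Send a test message to verify authentication and delivery
Send-MailMessage -From $TestUser -To $TestUser -Subject "FAS Renewal test mail" `
    -Body "If you can read this, authenticated SMTP is working." `
    -SmtpServer "smtp.office365.com" -Port 587 -UseSsl -Credential $Credential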
If this testing phase proceeds without issues, you can proceed with the FAS Renewal Script implementation.
FAS Renewal Script:
The script relies on several configurable variables that must be set before execution to ensure its proper functioning. One critical variable to adjust is “ThresholdDays,” which should be tailored to your specific requirements. I recommend setting it to 30-60 days to ensure renewal well before the certificate expiration date. During testing, I used a value of 730 days, as newly created certificates are valid for 329 days. This extended period allows for multiple tests until you are satisfied.
Additionally, ensure that the “CredentialFile” variable points to the same location where you saved the encrypted text file in the previous step.
# Load Citrix FAS snap-in
Add-PSSnapin Citrix.A*

# Configuration
$ThresholdDays = 730
$SleepTime = 10
$AdminEmail = "[email protected]"
$ToEmail = "[email protected]"
$CredentialFile = "C:\Scripts\cred.txt"
$SmtpServer = "smtp.office365.com"
$SmtpPort = 587

# Get current certificate information
$CitrixFasAddress = (Get-FasServer)[0].Address
$DefaultCA = (Get-FasMsCertificateAuthority -Default).Address
$CurrentCertificate = Get-FasAuthorizationCertificate -FullCertInfo
$DaysDifference = ($CurrentCertificate.ExpiryDate - (Get-Date)).Days

# Load credentials from file
$SecurePassword = Get-Content $CredentialFile | ConvertTo-SecureString
$Credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $AdminEmail, $SecurePassword

# Function to send email notifications
function Send-EmailNotification {
    param (
        [string]$Subject,
        [string]$Body
    )
    Send-MailMessage -From $AdminEmail -To $ToEmail -Subject $Subject -Body $Body -SmtpServer $SmtpServer -Credential $Credential -UseSsl -Port $SmtpPort
}

if ($DaysDifference -lt $ThresholdDays) {
    # Request a new certificate
    $NewCertificate = New-FasAuthorizationCertificate -CertificateAuthority $DefaultCA -CertificateTemplate "Citrix_RegistrationAuthority" -AuthorizationTemplate "Citrix_RegistrationAuthority_ManualAuthorization"

    # Send email notification about pending certificate
    Send-EmailNotification -Subject "Pending FAS Authorization Certificates | $CitrixFasAddress" -Body "There is a pending FAS authorization certificate. Please issue the certificate on server $($CurrentCertificate.Address) so that FAS keeps working. After issuing, everything is automated and you receive a mail if everything was successful."

    # Wait for certificate to be issued
    do {
        # Get all FAS authorization certificates
        $Certificates = Get-FasAuthorizationCertificate

        # Check if there are certificates with a status other than "Ok"
        $CertificatesNotOk = $Certificates | Where-Object { $_.Status -ne "Ok" }

        if ($CertificatesNotOk.Count -gt 0) {
            # Wait before checking again
            Start-Sleep -Seconds $SleepTime
            Write-Output "Waiting for certificate to be issued"
        }
    } while ($CertificatesNotOk.Count -gt 0)

    # All certificates have status "Ok"
    Write-Output "All Certificates are Issued."

    # Remove the old Certificate and update the Certificate Definition
    Write-Output "Removing the old Certificate"
    Remove-FasAuthorizationCertificate -Id $CurrentCertificate.Id
    $CertificateDefinition = (Get-FasCertificateDefinition)[0].Name
    Write-Output "Updating the Certificate Definition with the new Certificate"
    Set-FasCertificateDefinition -Name $CertificateDefinition -AuthorizationCertificate $NewCertificate.Id

    # Send email notification about certificate renewal
    $CurrentCertificateNew = Get-FasAuthorizationCertificate -FullCertInfo
    $DaysDifferenceNew = ($CurrentCertificateNew.ExpiryDate - (Get-Date)).Days
    Send-EmailNotification -Subject "FAS Authorization Certificates Renewed | $CitrixFasAddress" -Body "The FAS authorization certificate is renewed. It's now valid for $($DaysDifferenceNew) days until $($CurrentCertificateNew.ExpiryDate)."
} else {
    Write-Output "Certificate expiration date is $($CurrentCertificate.ExpiryDate)"
}
After configuring these variables, save the script to the C:\Scripts directory.
Create a Scheduled Task
Configure the scheduled task with a trigger that aligns with your customer’s preferences. I suggest selecting a workday for scheduling and avoiding Mondays, as mailboxes may be full, potentially causing the FAS Renewal Mail to go unnoticed. Configure the task to execute as follows:
Use the same account that was used for creating the encrypted text file; otherwise, the script will not function as expected.
Conclusion
With these steps completed, you are now equipped to receive timely notifications when an FAS Authorization Certificate is approaching its expiration date. If you have any comments or questions, please feel free to reach out for further assistance.
When working with Machine Creation Services, we discovered that the clients all have the same CMID, while a unique CMID is needed to activate against a KMS server. We searched for a solution, and everyone recommended rearming the machine, but we then received the message that the machine couldn’t be rearmed. Searching further, everyone pointed to setting SkipRearm to 1; when testing this, we discovered that the CMID stayed the same after rearming the Windows machine.
We discovered that the “Remaining Windows Rearm Count” was 0, which means that when you set SkipRearm, the machine isn’t actually rearmed. Knowing this, we searched for a way to reset the rearm count and found an article that explains how to do it: http://www.daniel-mitchell.com/blog/reset-windows-7-rearm-count/
Restart your machine into repair mode using the F8 key.
In the System Recovery Options menu, select Command Prompt.
Now type in D:\reset.bat (the Windows drive appears as D: here because the C: drive letter is used by the recovery partition in this mode).
If the script is executed successfully, you should get the message “The operation completed successfully”.
Reboot your machine.
Now the rearm count is reset to 5. You can check this using slmgr /dlv.
Now make sure SkipRearm is set to 0 at the following location: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\SoftwareProtectionPlatform
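A minimal sketch to set the value and verify the rearm count afterwards (note the space in “Windows NT”):
PowerShell
# Make sure SkipRearm is back to 0 before sealing the image
Set-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\SoftwareProtectionPlatform" -Name "SkipRearm" -Value 0

# Verify the remaining rearm count (should show 5 after the reset)
cscript.exe //nologo C:\Windows\System32\slmgr.vbs /dlv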
Now the image is ready to deploy, and the KMS server will receive different CMIDs from the servers.
With Windows Server 2012 and Windows 8, Microsoft added some new features and also created the corresponding Group Policy configuration options. Windows Server 2012 and Windows 8 introduce 160 new policy items that are only compatible with Windows Server 2012, Windows 8, or Windows RT, and a total of 350 items that are compatible with both earlier versions and the new Windows Server 2012 and Windows 8.
With Microsoft Deployment Toolkit (MDT) 2012 Update 1, it’s possible to deploy Windows 8, Windows 7, Office 2010 and 365, Windows Server 2012, and Windows Server 2008 R2, in addition to Windows Vista, Windows Server 2008, Windows Server 2003, and Windows XP within your organization.
After downloading the MSI file, start the installation; it’s basically a next, next, finish installation.
1) On the Welcome Screen click Next.
2) On the End User License Agreement, Accept the license terms and click Next.
3) Because I work in a test environment, I don’t want to join the Customer Experience Improvement Program, so I select “I don’t want to join the program at this time” and click Next.
In a previous post, we installed VAMT and added licenses and devices; now we will set up Active Directory-Based Activation. To use Active Directory-Based Activation, you need at least one Windows Server 2012 domain controller.
Installing Active Directory-Based Activation
To use AD-Based Activation, you need to install the Volume Activation Services server role. This can be done using Server Manager: click Manage and then Add Roles and Features.
Select Role-based or feature-based installation and click Next.
Select the server on which you want to install the Volume Activation Services from the server pool and click Next.
Select the Volume Activation Services in the Server Roles screen and click Next.
A new pop-up screen will appear displaying the required features; just select Add Features.