Use Power Automate to automatically create SharePoint News Links from an RSS feed


This article is contributed. See the original author and article here.

There’s a blog for that


A somewhat common complaint I’ve heard from organizations I’ve worked with is that folks within the organization are frequently unaware of press releases, blogs, or other information the organization is publicly sharing. In fact, I’m guilty of it as well. On numerous occasions, I’ve gone to a coworker for some quick troubleshooting only to be told “I wrote a blog for that”.


 


Now that Microsoft Viva Connections is here, I’ve been putting a lot of energy into my company’s SharePoint home site and trying to come up with ways to break down the information silos we’ve just naturally accrued over the years.


 


Fortunately, it turned out that our company blog already had an RSS feed set up, which opened up some opportunities. One of them was to create a flow in Power Automate that automatically creates a SharePoint “News Link” on our home site whenever a new blog post is published to our public site.


 


So, with this blog, we’ll walk through the steps used to accomplish that feat.


 


FlowOverview.png


 


Triggered


As with any flow, we need something to kick things off. I was afraid that this was going to be the biggest technical challenge but, thankfully, it turns out that there is a trigger purpose-built to do exactly what we need: the When a feed item is published trigger!


 


1-trigger.png


 


As you can see, the configuration here is dead simple: provide the URL of an RSS feed and select either the PublishDate or UpdatedOn value. We’ll stick with the default PublishDate setting so that we’re only triggered by brand-new articles.


 


So, with this configuration, our flow will be executed anytime a new article is published to the Xbox news RSS feed.


 


Once triggered, seemingly regardless of the specific RSS feed’s schema, a standardized JSON object is returned to the flow that gives us most of what we need.


 


{
  "body": {
    "id": "https://news.xbox.com/en-us/?p=152438",
    "title": "Wasteland 3: The Battle of Steeltown Releasing June 3",
    "primaryLink": "https://news.xbox.com/en-us/2021/04/15/wasteland-3-the-battle-of-steeltown-releasing-june-3/",
    "links": [
      "https://news.xbox.com/en-us/2021/04/15/wasteland-3-the-battle-of-steeltown-releasing-june-3/"
    ],
    "updatedOn": "0001-01-01 00:00:00Z",
    "publishDate": "2021-04-15 14:00:00Z",
    "summary": "The Wasteland 3 team here at inXile is very excited to announce the first narrative expansion for Wasteland 3: The Battle of Steeltown will be releasing June 3. Since the game's launch last August, we've been working on adding new features, quality of life changes, and fixing bugs and improving game stability and performance. But [...]",
    "copyright": "",
    "categories": []
  }
}

 


Even better, this data gets turned into variables we can access through the Dynamic Content selector in Power Automate.


 


1-blog-properties.png


 


Take a picture, it’ll last longer


One thing we don’t get is any sort of image to show, which is a bummer because without one, all of our News Links would end up looking like the image below.


 


3-blog-no-image.png


 


Thankfully, SharePoint has a handy-dandy little service hidden away that can help.


 


If you’ve ever created a new “News Link”, you’ll know that you simply give SharePoint the URL to your article and it auto-magically snags the title, summary, and a thumbnail image to use. If you open up your browser’s developer tools, you can see that SharePoint calls the _api/SP.Publishing.EmbedService/EmbedData endpoint, passing along an encoded URL and some additional query strings. It turns out that this is what handles all that ‘magic’, and it’s also something we can leverage for our own ends here!


 


Thanks to the output of our trigger, we know the URL of the blog post we’re working with, and we can access it through the primaryLink variable. However, we do need to make sure that the URL is in the right format, so we’ll create our own variable to make it so.


 


4-primarylinkencoded.png


 


We’ll call it PrimaryLinkEncoded, make it a string, and initialize its value using the following expression: concat('%27',encodeUriComponent(triggerOutputs()?['body/primaryLink']),'%27')


 


Once run, we’ll end up with an encoded URL surrounded by apostrophes, which is what the EmbedData service expects.
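As a quick aside, the effect of that expression can be reproduced outside Power Automate. Here is a minimal Python sketch, assuming the behavior of encodeUriComponent matches JavaScript’s encodeURIComponent; the URL used is just an illustrative stand-in:

```python
from urllib.parse import quote

def encode_primary_link(url: str) -> str:
    """Mimic concat('%27', encodeUriComponent(url), '%27'):
    percent-encode the URL, then wrap it in encoded apostrophes."""
    # encodeURIComponent leaves only A-Za-z0-9 and these marks unescaped
    encoded = quote(url, safe="!'()*-._~")
    return f"%27{encoded}%27"

# Hypothetical article URL for illustration
link = encode_primary_link("https://news.xbox.com/en-us/2021/04/15/example-post/")
print(link)
```

The apostrophes become %27 and characters like `:` and `/` become %3A and %2F, matching what the EmbedData service expects.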


 


Now that we have that, we just need to call the aforementioned service using the Send an HTTP request to SharePoint action.


 


6-getthumbnail.png


 


We’ll be making a GET request to the root of our SharePoint site. Technically, this could be any SharePoint site you have access to, but since we’ll be posting news articles to our home site, we’ll just stick with that.


 


For the Uri configuration, we’re calling the previously mentioned service with a few required query string parameters like so: _api/SP.Publishing.EmbedService/EmbedData?url=@{variables('PrimaryLinkEncoded')}&version=1&bannerImageUrl=true


 


We’re passing along the encoded URL we created in the last step, specifying version 1 (which is required, despite there being only one version), and asking for the bannerImageUrl to be included (otherwise we won’t get an image back).


 


We only need to include one header, the accept header, with a value of application/json;odata.metadata=minimal.


 


Finally, to make things a bit easier to use in a moment, we’ll capture the output of this request into a variable using the Initialize Variable action again, like so.


 


7-BannerImageUrl.png


 


We’re creating a new string variable named BannerImageUrl and we’re setting its value using the following expression: outputs('Get_Thumbnail')?['body']?['d']?['ThumbnailUrl']
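To make the path in that expression concrete, here is a small Python sketch of walking an EmbedData-style response. The response values are invented for illustration, not captured from the live service:

```python
import json

# Illustrative EmbedData-style response body; the values are made up
sample_response = json.loads("""
{
  "d": {
    "ThumbnailUrl": "https://news.xbox.com/example-banner.jpg",
    "Title": "Example post",
    "Description": "Example summary"
  }
}
""")

# Equivalent of outputs('Get_Thumbnail')?['body']?['d']?['ThumbnailUrl'];
# .get() mirrors the '?' operator: a missing key yields None, not an error
banner_image_url = (sample_response.get("d") or {}).get("ThumbnailUrl")
print(banner_image_url)
```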


 


Compose yourself


Now that we’ve got just about everything we can get, we need to put it into the format that SharePoint expects when creating a News Link item, so it’s time to prepare our payload using the Compose action.


 


8-compose.png


 


It’s a fairly simple and (mostly) self-explanatory bit of JSON, so we won’t dwell on it much. Below is the exact JSON used in the above screenshot.


 


{
  "BannerImageUrl": @{variables('BannerImageUrl')},
  "Description": @{triggerOutputs()?['body/summary']},
  "IsBannerImageUrlExternal": true,
  "OriginalSourceUrl": @{triggerOutputs()?['body/primaryLink']},
  "ShouldSaveAsDraft": false,
  "Title": @{triggerOutputs()?['body/title']},
  "__metadata": {
    "type": "SP.Publishing.RepostPage"
  }
}

 


Spread the word


The only thing left to do now is make our post, which we’ll do by using another Send an HTTP request to SharePoint action, shown below.


 


9-post.png


 


This time, we’ll be making a POST to the _api/sitepages/pages/reposts endpoint (which is what SharePoint does when you post a news link).


 


Our headers are only slightly more involved. Our endpoint is expecting to receive and will return JSON, so we need to include the appropriate headers…


 


{
  "accept": "application/json",
  "content-type": "application/json;odata=verbose;charset=utf-8"
}

 


Last but not least, we need to include the Output of the compose action we created in the previous step so that SharePoint knows what we’re sharing.
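For reference, the request we’ve assembled boils down to a JSON body plus those two headers. A minimal Python sketch, with placeholder values standing in for the trigger outputs and variables:

```python
import json

# Placeholder values standing in for the flow's dynamic content
payload = {
    "BannerImageUrl": "https://news.xbox.com/example-banner.jpg",
    "Description": "Example summary from the RSS item",
    "IsBannerImageUrlExternal": True,
    "OriginalSourceUrl": "https://news.xbox.com/en-us/example-post/",
    "ShouldSaveAsDraft": False,
    "Title": "Example post title",
    "__metadata": {"type": "SP.Publishing.RepostPage"},
}

headers = {
    "accept": "application/json",
    "content-type": "application/json;odata=verbose;charset=utf-8",
}

# This body would be POSTed to <site>/_api/sitepages/pages/reposts
body = json.dumps(payload)
print(body[:60])
```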


 


Once that’s all set up, go ahead and save.


 


Wrapping up


At this point, you’re done developing. The only thing left to do is wait, really. Once new items are published to the RSS feed, you’ll eventually see them start showing up in your News web parts!


 


10-done.png


 

Hidden Treasure Part 2: Mining Additional Insights



Written by Jason Yi, PM on the Azure Edge & Platform team at Microsoft. 


Acknowledgements: Dan Lovinger, Principal Software Engineer


 


On the last episode of discovering hidden treasure, we took a closer look at what type of data lies within the DiskSpd XML output. Today, we will examine an example of how to take advantage of that data and create new and practical insights.


 


DiskSpd on Azure


Let’s say that we are using Azure VMs to simulate some workload using DiskSpd. To visualize the data, let’s go ahead and use a short script that takes the XML output and extracts the total IOs per bucket into a CSV file for a more graphical view.


Picture1.png


 


As you can see, the IOPS are relatively constant, with an occasional bump. That’s because we are maxing out the total number of IOPS our Azure environment (a 3-node cluster using Standard B2ms VMs) can handle. Azure also artificially throttles the IOPS limit based on your VM size and drive type. In our case, the VM limit is 1920 IOPS, and you can see that our peak is ~1950 IOPS. The occasional spike and drop in IOPS is likely due to Azure attempting to rebalance itself and locate the throttle limit.


Using Azure VMs, we can see that the IOPS values are relatively constant, but that’s not very interesting, nor is it representative of a real workload. Real-world workloads are much messier and more random. Perhaps there is a way to replicate random IO activity to represent typical day-to-day activity. Well, you are in luck, because there is a script for that. Let’s try it!


 


Randomize IOPS experiment


Note: The IOPS variance is purely artificial and for educational purposes only. By no means does this replicate any real-world IO scenario.


 


To help demonstrate this experiment, I’ve written a short script called “iops_randomizer.ps1” to simulate random IO activity. The script uses a set of parameters to run DiskSpd in short, one-second bursts. The IO values are randomized each second by using the (-g) parameter to throttle the throughput, which in turn affects the IOPS limit. Here are the script’s parameters:



  • -d (mandatory) = The number of DiskSpd tests. Because each test run corresponds to one second, you can think of this as the total duration of the script.

  • -path (mandatory) = the path to the test file.

  • -rw_flag = Takes one of two options, 0 or 1. 0 means you provide a custom read/write ratio via the -w parameter; 1 means the read/write ratio is randomized, with no -w value needed. The default is 0, and if no -w value is provided, the script falls back to -w 0 (100% read).

  • -g_min = The minimum value possible when randomizing the throughput (defines the lower bound). The default value is 0 bytes per millisecond.

  • -g_max = The maximum value possible when randomizing the throughput (defines the upper bound). The default value is 8000 bytes per millisecond.

  • -b = The block size in bytes. The default is 4096 bytes (4KiB).

  • -r = The random I/O aligned to the specified size in bytes. The default is 4096 bytes (4KiB).

  • -o = The outstanding IO requests per target per thread. The default is 32.

  • -t = The number of threads per target file. The default is 4.

  • -w = The percentage of operations that are write requests. The default is 0% writes, 100% reads.


 


Note: You may find that your IOPS values are ridiculously small. This is because the default parameters are not tuned to your more powerful environment. Consequently, you may need to experiment with the (-g) parameter range. Remember that because the values are in bytes per millisecond, you will need to perform some unit conversion to confirm that you are randomizing your values effectively.


 


Here is the conversion I used:


Picture2.png
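The conversion amounts to: bytes per millisecond × 1000 = bytes per second, divided by the block size = an approximate IOPS ceiling. A small Python sketch of the same arithmetic:

```python
def g_to_iops(g_bytes_per_ms: float, block_size_bytes: int = 4096) -> float:
    """Convert a DiskSpd -g throughput throttle (bytes per millisecond)
    into an approximate IOPS ceiling for a given block size."""
    bytes_per_second = g_bytes_per_ms * 1000
    return bytes_per_second / block_size_bytes

# The script's default -g_max of 8000 bytes/ms with 4 KiB blocks:
print(g_to_iops(8000))  # 1953.125
```

So the default upper bound of 8000 bytes/ms caps each one-second burst at roughly 1950 IOPS with 4 KiB blocks, which is why larger environments need a wider (-g) range.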


 


Let’s now try running the following script:


Picture3.PNG


 


After about 120 seconds, you should see 3 files in your current directory.



  • expand_profile.xml : This file is created when the script is first run and contains all the DiskSpd test runs with their respective parameters. This is later fed into DiskSpd as an input. As a result, the file only contains the <Profile> element. You may use this file to modify any parameters you desire and feed it back into DiskSpd.

  • output.xml : This is the finalized output file that is created after the DiskSpd test is complete.

  • iops_stat_seconds.csv : This file contains the clean data for the number of IOs for each second the DiskSpd test was run.


Now that we have the csv output, we can create a graph that plots total IO vs time (seconds). We now have some variance in the number of IOs!


Picture4.png


 


IO Percentiles


As you’ve just seen, there is potential in experimenting with the XML output. Perhaps you wish to derive other data that may be valuable for your situation. For example, maybe we want to examine the percentile values of the IO operations. Let’s try it: a second script called “get_iops_percentile.ps1” takes the iops_stat_seconds.csv file and calculates the percentile scores for the IO values. After running the script, you should see a file called iops_percentiles.csv as well as a copy of the output in the PowerShell terminal.


Picture5.png


 


These percentile values can help us understand the different segmentations of IO values, gauge the average IO output for each second, and identify trends. In our example, we can see that 99% of the IOPS are less than ~1635.
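The interpolation that get_iops_percentile.ps1 performs can be condensed into a few lines of Python (the IOPS values below are illustrative, not real measurements): compute rank = (n − 1) · p + 1, then linearly interpolate between the two neighboring samples.

```python
def percentile(sorted_vals, p):
    """Linearly interpolated percentile, mirroring the PowerShell logic:
    rank = (n - 1) * p + 1, then interpolate between neighbors."""
    n = len(sorted_vals)
    rank = (n - 1) * p + 1
    if rank <= 1:
        return sorted_vals[0]
    if rank >= n:
        return sorted_vals[-1]
    k = int(rank)   # integer part (1-based rank)
    d = rank - k    # fractional part
    return sorted_vals[k - 1] + d * (sorted_vals[k] - sorted_vals[k - 1])

iops = [1200, 1450, 1600, 1700, 1900]  # illustrative per-second totals
print(percentile(iops, 0.50))  # 1600.0
print(percentile(iops, 0.99))
```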


 


Bonus: rw_flag


This section provides more information on the rw_flag to clear up any potential confusion. You may be wondering: what is the difference between using 0 and 1?


 


The main difference is that with an rw_flag of 0, you can provide an additional write percentage via the (-w) parameter. For example, if you provide 30, then 30% of the IO will be writes and 70% will be reads. Every DiskSpd test will then use that same value, producing a consistent mix of read and write IOs in the long run.


 


However, with an rw_flag of 1, the user does not need to specify any read/write ratio. Instead, the ratio is randomized each second between 0% and 100%.
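The two modes can be sketched side by side in Python — a simplified stand-in for the script’s logic, with a seed added only to keep the sketch repeatable:

```python
import random

def write_ratios(runs: int, rw_flag: int, w: int = 0, seed: int = 0):
    """Return the per-run write percentage, mirroring iops_randomizer.ps1:
    rw_flag 0 -> the same -w value every run; rw_flag 1 -> re-rolled each second."""
    rng = random.Random(seed)
    if rw_flag == 0:
        return [w] * runs
    # Matches Get-Random -Minimum 0 -Maximum 100 (inclusive 0, exclusive 100)
    return [rng.randint(0, 99) for _ in range(runs)]

print(write_ratios(5, 0, w=30))  # [30, 30, 30, 30, 30]
print(write_ratios(5, 1))        # five randomized percentages
```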


 


Using the performance monitor within Windows Admin Center, the result may look something like this: (left side uses rw_flag=0, right side uses rw_flag=1)


Picture6.png


Final remarks


Today’s experiment was one example of extrapolating new data from the XML output. If you believe DiskSpd is not giving you a specific metric and wish to infer other data, this may be one method of manually discovering new “treasures.” Have fun!


 


*Script 1: iops_randomizer*

# Written by Jason Yi, PM

<#
.PARAMETER d
integer number of diskspd runs (can consider it as duration since each run is one second long)
.PARAMETER path
the path to the test file
.PARAMETER rw_flag
the default is 0. 0 represents that the user wants to input their custom read/write ratio whereas 1 represents that the user wants a randomized read/write ratio
.PARAMETER g_min
the minimum g parameter (g parameter is the throughput threshold)
.PARAMETER g_max
the maximum g parameter (g parameter is the throughput threshold)
.PARAMETER b
the block size in bytes
.PARAMETER r
random IO aligned to specified size in bytes
.PARAMETER o
the queue depth
.PARAMETER t
the number of threads
.PARAMETER w
the ratio of write tests to read tests
#>
Param (
[Parameter(Position=0,mandatory=$true)][int]$d,
[Parameter(Position=2,mandatory=$true)][string]$path, # e.g. C:\ClusterStorage\CSV01\IO.dat
[int]$rw_flag = 0,
[int]$g_min = 0,
[int]$g_max = 8000,
[int]$b = 4096,
[int]$r = 4096,
[int]$o = 32,
[int]$t = 4,
[int]$w = 0)

Function Create-Timespans{
<#
.DESCRIPTION
This function takes the input number of diskspd runs (or duration) and lasts for that input number of seconds while randomizing
the throughput threshold within a specified range. Includes same parameters initially passed in by user.
#>
Param (
[int]$d,
[string]$path,
[int]$g_min,
[int]$g_max,
[int]$b,
[int]$r,
[int]$o,
[int]$t,
[int]$w,
[int]$rw_flag
)



[xml]$xml=@"
<Profile>
<Progress>0</Progress>
<ResultFormat>xml</ResultFormat>
<Verbose>false</Verbose>
<TimeSpans>
<TimeSpan>
<CompletionRoutines>false</CompletionRoutines>
<MeasureLatency>true</MeasureLatency>
<CalculateIopsStdDev>true</CalculateIopsStdDev>
<DisableAffinity>false</DisableAffinity>
<Duration>1</Duration>
<Warmup>0</Warmup>
<Cooldown>0</Cooldown>
<ThreadCount>0</ThreadCount>
<RequestCount>0</RequestCount>
<IoBucketDuration>1000</IoBucketDuration>
<RandSeed>0</RandSeed>
<Targets>
<Target>
<Path>$path</Path>
<BlockSize>$b</BlockSize>
<BaseFileOffset>0</BaseFileOffset>
<SequentialScan>false</SequentialScan>
<RandomAccess>false</RandomAccess>
<TemporaryFile>false</TemporaryFile>
<UseLargePages>false</UseLargePages>
<DisableOSCache>true</DisableOSCache>
<WriteThrough>true</WriteThrough>
<WriteBufferContent>
<Pattern>sequential</Pattern>
</WriteBufferContent>
<ParallelAsyncIO>false</ParallelAsyncIO>
<FileSize>1073741824</FileSize>
<Random>$r</Random>
<ThreadStride>0</ThreadStride>
<MaxFileSize>0</MaxFileSize>
<RequestCount>$o</RequestCount>
<WriteRatio>$w</WriteRatio>
<Throughput>0</Throughput>
<ThreadsPerFile>$t</ThreadsPerFile>
<IOPriority>3</IOPriority>
<Weight>1</Weight>
</Target>
</Targets>
</TimeSpan>
</TimeSpans>
</Profile>
"@


# 1 flag means that the user wishes to randomize the rw ratio
# 0 flag means that the user wishes to control the rw ratio
# Basically, throw an error when the flag is not 0 or 1
if ( ($rw_flag -ne 1) -and ($rw_flag -ne 0) ){
throw "Invalid rw_flag value. Please choose 0 to provide your own rw ratio, or 1 to randomize the rw ratio."
}

$path = Get-Location
# loop up until the number of runs (duration) and add new timespan elements
for($i = 1; $i -lt $d; $i++){

$g_param = Get-Random -Minimum $g_min -Maximum $g_max
$true_w = Get-Random -Minimum 0 -Maximum 100

# if there is only one timespan, add another
if ($xml.Profile.Timespans.ChildNodes.Count -eq 1){

# clone the current timespan element, modify it, and append it as a child
$new_t = $xml.Profile.Timespans.Timespan.Clone()
$new_t.Targets.Target.Throughput = "$g_param"
if ($rw_flag -eq 1){
$new_t.Targets.Target.WriteRatio = "$true_w"
}
$null = $xml.Profile.Timespans.AppendChild($new_t)

}
else{

# clone the current timespan element, modify it, and append it as a child
$new_t = $xml.Profile.Timespans.Timespan[1].Clone()
$new_t.Targets.Target.Throughput = "$g_param"
if ($rw_flag -eq 1){
$new_t.Targets.Target.WriteRatio = "$true_w"
}
$null = $xml.Profile.Timespans.AppendChild($new_t)

}
}

# show updated result
$xml.Profile.Timespans.Timespan
# save into xml file
$xml.Save("$path\expand_profile.xml")

}
#
# SCRIPT BEGINS #
#


# create the xml file with diskspd parameters
Create-Timespans -d $d -g_min $g_min -g_max $g_max -path $path -b $b -r $r -o $o -t $t -w $w -rw_flag $rw_flag


# create path, input file, and node variables
$path = Get-Location
# feed profile xml to DISKSPD with -X parameter (Running DISKSPD)
Invoke-Expression ".\diskspd.exe -X '$path\expand_profile.xml' > output.xml"

$file = [xml] (Get-Content "$path\output.xml")


$nodelist = $file.SelectNodes("/Results/TimeSpan/Iops/Bucket")
$ms = $nodelist.getAttribute("SampleMillisecond")

# store the bucket objects into a variable
$buckets = $file.Results.TimeSpan.Iops.Bucket

# change the millisecond values to seconds
$time_arr = 1..$d
foreach ($t in $time_arr){
$buckets[$t-1].SampleMillisecond = "$t"
}

# select the objects you want in the csv file
$nodelist |
Select-Object @{n='Time (s)';e={[int]$_.SampleMillisecond}},
@{n='Total IOs';e={[int]$_.Total}} |
Export-Csv "$path\iops_stat_seconds.csv" -NoTypeInformation -Encoding UTF8 -Force # Have to force UTF8 encoding or the data lands in one column (UCS-2)

# import modified csv once more
$fileContent = Import-csv "$path\iops_stat_seconds.csv"

# if duration is less than 7 (number of percentile ranks), then add empty rows to fill that gap
if ($d -lt 7 ) {
for($i=$d; $i -lt 7; $i++) {
# add new row of values that are empty
$newRow = New-Object PsObject -Property @{ "Time (s)" = '' }
$fileContent += $newRow
}
}

# show output in the terminal
$fileContent | Format-Table -AutoSize

# export to a final csv file
$fileContent | Export-Csv "$path\iops_stat_seconds.csv" -NoTypeInformation -Encoding UTF8 -Force

 


*Script 2: get_iops_percentiles*

# Written by Jason Yi, PM

Function Get-IopsPercentiles{
<#
.DESCRIPTION
This function expects an array of sorted iops, length of the iops array, and an array of percentiles. For the given array of percentiles,
it returns the calculated percentile value for the set of iops numbers.

.PARAMETER sort_iops
array of sorted iops values from the input file
.PARAMETER iops_len
length of the sort_iops array
.PARAMETER percentiles
array of the percentiles you wish to find
#>
Param (
[array]$sort_iops,
[int]$iops_len,
[array]$percentiles)

$new_iops = New-Object System.Collections.ArrayList($null)
# loop through the percentiles array
foreach ($k in $percentiles) {

[Double]$num = ($iops_len - 1) * $k + 1

# if num is equal to 1 then add the first element to array
if ($num -eq 1) {

[void]$new_iops.Add( $sort_iops[0])
}

# if num is equal to the length of array then add the last element to array
elseif ($num -eq $iops_len) {
[void]$new_iops.Add( $sort_iops[$iops_len-1])
}

else {
$val = [Math]::Floor($Num)

#get decimal portion of the num
[Double]$dec = $num - $val

[void]$new_iops.Add( $sort_iops[$val - 1] + $dec * ($sort_iops[$val] - $sort_iops[$val - 1]))
}

}
return $new_iops

}


# Set path and import the csv file
$path = Get-Location
$file = Import-Csv "$path\iops_stat_seconds.csv"

#$sort_iops = $file."Total IOPS" | Sort-Object -Property {$_ -as [decimal]}


# sort the values in IOPS column in ascending order
$sort_iops = [decimal[]] $file."Total IOs"
[Array]::Sort($sort_iops)

# remove the empty or 0 values
$sort_iops = @($sort_iops) -ne '0'

$iops_len = $sort_iops.Length
#$percentiles = (1,25,50,75,90,95,99)
$percentiles = (.01,.25,.50,.75,.90,.95,.99)

# find the calculated percentiles and put them in an array
$new_iops = Get-IopsPercentiles $sort_iops $iops_len $percentiles

# if the old iops length is less than the length of the new calculated iops scores, then that new length is the iops_len
$new_iops_len = $new_iops.Length
if($iops_len -le $new_iops_len){
$iops_len = $new_iops_len
}


# loop through all the CSV rows and insert 2 new columns for the percentile rank and scores
for ($i = 0; $i -lt $iops_len; $i++) {
$value = if ($i -lt $percentiles.Count) { $percentiles[$i] } else { $null }
$file[$i] | Add-Member -MemberType NoteProperty -Name "Percentile Rank" -Value $value

$value2 = if ($i -lt $percentiles.Count) { $new_iops[$i] } else { $null }
$file[$i] | Add-Member -MemberType NoteProperty -Name "IOPS %-tile Score" -Value $value2

}

# Show output to terminal
$file | Format-Table -AutoSize

# Export to a new CSV file
$file | Export-Csv -Path "$path\iops_percentiles.csv" -NoTypeInformation -Force

 

Use Helm Charts from Windows Client Machine to Deploy SQL Server 2019 Containers on Kubernetes


Helm is the package manager for Kubernetes. Learn with Amit Khandelwal on Data Exposed how you can use Helm from your Windows machine to deploy SQL Server 2019 containers on Kubernetes, all in less than 5 minutes.


 


Watch on Data Exposed



Resources:

Deploy SQL Server on Azure Kubernetes Service cluster via Helm Charts – on a windows client machine



 


View/share our latest episodes on Channel 9 and YouTube!

Microsoft Federal Collaboration and Cybersecurity Summit



reg is open.jpg


 


 


Here at Microsoft, our mission is to empower every person on the planet to achieve more.

Microsoft Federal shares that commitment to further our government customers’ digital transformation, innovation, and secure government collaboration.

Please join us next Tuesday for our Federal Collaboration and Cybersecurity Summit, a half-day virtual event at no additional cost, designed to advance U.S. Federal agencies’ collaboration and cybersecurity initiatives.


Microsoft is bringing together executives and leaders from U.S. Federal agencies to deliver key insights, lessons learned, and practical guidance on:


 



  • Advancing Cybersecurity in the Federal Government

  • Cultural transformations that drive new ways of working and digital modernization.

  • Breaking down silos to facilitate partnership with industry and academia.

  • Connecting with people and information from the office or in the field to securely share and protect sensitive information.


In the face of unprecedented challenges today, leadership resiliency is paramount. The high stakes of cybersecurity challenges continue to increase and evolve with no end in sight. The frequency and sophistication of cybersecurity threats will continue to grow, and as the threat of cyber breaches increases, so does the need for intergovernmental collaboration, communication, and data sharing.


 


Click HERE to register today and learn more.


 

Azure Marketplace new offers – Volume 130


We continue to expand the Azure Marketplace ecosystem. For this volume, 86 new offers successfully met the onboarding criteria and went live. See details of the new offers below:


Applications


 


uiCOCKPIT.png

[ui!] COCKPIT: Urban Software Institute’s [ui!] COCKPIT enables visualization of complex data from a cloud-based platform, such as [ui!] UrbanPulse. Choose from different visualizations, providing general information for the public, management decision aids, and customized applications for specific subjects.


AdstraConsumerEssentials.png

Adstra Consumer Essentials: Adstra Consumer Essentials provides a comprehensive data set of more than 230 million US-based individuals, including data elements commonly used by marketers and advertisers. The proprietary data set is drawn from various sources including public records and a leading global risk/fraud prevention provider.


AITRICS.png

AITRICS: VitalCare from AITRICS is a risk-prediction system built on Microsoft AI services. VitalCare directly collects patient data, such as vital signs and lab tests, from electronic medical records and generates prediction scores for clinical deterioration and sepsis.


AlefPlatform.png

Alef Platform: Alef Education’s platform provides data analytics to help teachers focus on where students are in their mastery of a subject. Alef provides experiential learning that enables students to apply and transfer their newly acquired skills.


AlgoSupplyChainAnalyticsCollaborativePlatform.png

Algo Supply Chain Analytics Collaborative Platform: Algo’s advanced analytics solutions help companies operate highly efficient supply chains by using AI and deep learning to maximize revenue and profit while optimizing inventory spending. Business users can interact with Algo using chat functionality through platforms such as Microsoft Teams.


ApacheWebServerwithDebian10.png

Apache Web Server with Debian 10: Cognosys provides this ready-to-run image containing Apache HTTP Server 2.4.38 installed on Debian 10 Linux. Apache includes software to handle multi-processing modes and support for SSL v3 and TLS via mod_ssl.


Apifon-Multi-channelBusinessMessagingPlatform.png

Apifon – Multi-channel Business Messaging Platform: With Apifon’s messaging platform, you can engage customers through their favorite channels, track the performance of your campaigns, and turn data into KPIs that help you increase your ROI.


atmaioConnectedProductCloud.png

atma.io Connected Product Cloud: Avery Dennison’s atma.io platform creates, manages, and assigns digital identities to products, enabling end-to-end transparency for tracking, storing, and managing events for individual products from source to consumer.


AvnetIoTConnectandSmartFactory.png

Avnet IoT Connect and Smart Factory: Built on IoTConnect and Microsoft Azure, Avnet’s Smart Factory solution helps you monitor and track the production and performance on your factory floor. Gain real-time insights for all locations and integrate your data with supply chain management systems.


AwarenessPlatform.png

Awareness Platform: This solution from i5 B. V. provides ready-to-go professional learning focused on security and privacy to reduce risky behavior by your employees. With Awareness Platform, you can customize courses with a few clicks to match your organization’s policies.


BoxOpsPlatform.png

BoxOps Platform: BoxBoat’s BoxOps is a DevSecOps service solution designed for software teams, enterprise operations, and IT staff who want to accelerate their end-to-end management of app deployment.


ChatbotSmartRH.png

Chatbot Smart RH: SMART RH from Alexys Solutions is an AI-powered chatbot designed to serve internal collaborators seeking HR assistance for leave requests, work certifications, and more. Automate HR requests and free employee time to concentrate on high-value work.


CloudCover365Exchangebackup.png

CloudCover 365: Exchange Backup: CloudCover 365 from virtualDCS lets you back up and restore Exchange Online data, including email, calendars, contacts, and more. The browser-based portal integrates with Veeam Backup 365 and Azure Active Directory.


OneDriveforBusinessCloudBackup.png

CloudCover 365:OneDrive for Business Backup: Back up OneDrive for Business data through a browser-based portal with CloudCover 365 from virtualDCS. CloudCover 365 integrates with Veeam Backup 365 and Azure Active Directory.


CompleteCloudBackupforMicrosoft365.png

Complete Cloud Backup for Microsoft 365: Implement CloudCover 365 from virtualDCS to back up and restore Microsoft 365 data, customize retention plans, schedule backups, and more. The browser-based portal integrates with Veeam Backup 365 and Azure Active Directory.


COMtracInvestigationandBriefManagementSolution.png

COMtrac Investigation & Brief Management Solution: COMtrac provides a consistent approach to managing investigations. The COMtrac platform is a management solution for cases, evidence, and briefs that can be used for all types of investigations by private sector clients and government entities.


ConnectedHeavyMachinery.png

Connected Heavy Machinery: Improve operational safety and utilization of your plants with Equiprise’s cloud-based monitoring solution built on IoT technology. Connected Heavy Machinery connects your equipment and provides you with key performance data.


CRMSensor.png

CRMSensor: Designed for retail chains, banks, healthcare providers, and convenience stores, CRMSensor is an Azure-based system that enables you to communicate interactively with customers. The solution includes an app for Android tablets and customized CRMSensor devices.


DataInsights.png

Data Insights: The oh22 Data Insights solution provides consulting, development, and implementation of a custom enterprise data solution based on Microsoft Azure Synapse Analytics, Azure Data Lake, and Azure Data Factory.


DigitalCustomerExperience.png

Digital Customer Experience: The EY Global Digital Customer Experience solution utilizes Microsoft Dynamics 365 along with an innovative array of EY tools and services, from UX to market research and content writing. Respond to digital change, cut costs, and make your organization fit for growth.


DigitalProcessIntegrationPlatform.png

Digital Process Integration Platform: PlanB. GmbH provides universal microservices for integration of your cloud-based digital services and applications. The PlanB. platform simplifies API management and integrates with on-premises systems, including ERP, CRM, project portfolio management, and manufacturing execution systems.


DigitalSalesServices.png

Digital Sales Services: Softtek enables digital sales from demand generation to e-commerce. Built on Microsoft Azure, Power BI, and Azure-based services, Digital Sales Services enables logistics, last-mile delivery, payments, and analytics.


DNAZ-DigitalBankingShrink-wrapped.png

DNA Z – Digital Banking Shrink-wrapped: DNA Z is an end-to-end digital banking solution for new or existing banks that is deployable on Microsoft Azure. The system includes a blueprint for bank policies and frameworks, fully mapped journeys, operating processes, mobile apps, and data analytics.


DockerCEwithDebian10.png

Docker CE with Debian 10: Cognosys has configured this ready-to-run image of Docker CE 20.10.4 on Debian 10 Linux. Docker Community Server is designed for developers and small teams looking to start with Docker and container-based apps. The image includes built-in orchestration, networking, and security.


EskerOrderManagementAutomation.png

Esker Order Management Automation: Order Management from Esker SA uses AI and robotic process automation to increase the efficiency of sales order processing. Customer service teams can electronically process and track faxes, emails, and orders with improved monitoring and accuracy.


ExperianOpenDataPlatform.png

Experian Open Data Platform: The Open Data Platform (ODP) gives you instant access to a customer’s financial information via Experian’s consumer and business credit information. You can easily create a picture of customer financial well-being to deliver new products and services.


GitlabCommunityEditionWithDebian10.png

GitLab Community Edition with Debian 10: Cognosys has pre-configured this ready-to-run image containing GitLab 13.9.1 on Debian 10 Linux. GitLab is a fast DevOps tool that provides a web-based method for managing Git repositories. GitLab includes wikis, issue tracking, and CI/CD pipelines.
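For readers new to GitLab's CI/CD pipelines, a minimal `.gitlab-ci.yml` has the following shape (an illustrative fragment, not something bundled with the Cognosys image):

```
stages:
  - test

run-tests:
  stage: test
  script:
    - echo "running the test suite"
```

Committing a file like this to the repository root is enough for GitLab to start running the pipeline on each push.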


GrafanawithDebian10.png

Grafana with Debian 10: Cognosys has pre-configured this ready-to-run image containing Grafana 7.4.3 on Debian 10 Linux. Grafana is a multi-platform, open-source web application providing analytics and interactive visualizations.


GrafanawithUbuntu1804LTS.png

Grafana with Ubuntu 18.04 LTS: Cognosys has pre-configured this ready-to-run image containing Grafana 7.4.3 on Ubuntu 18.04 LTS. Grafana is a multi-platform, open-source web application providing analytics and interactive visualizations.


GrafanawithUbuntu2004LTS.png

Grafana with Ubuntu 20.04 LTS: Cognosys has pre-configured this ready-to-run image containing Grafana 7.4.3 on Ubuntu 20.04 LTS. Grafana is a multi-platform, open-source web application providing analytics and interactive visualizations.


Haproxy18withDebian10.png

HAProxy 1.8 with Debian 10: Cognosys has pre-configured this ready-to-run image containing HAProxy 1.8.19 on Debian 10 Linux. HAProxy is an open-source, high-availability server that provides TCP/HTTP load balancing and proxying.
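As a rough illustration of the TCP/HTTP load balancing HAProxy provides, a minimal `haproxy.cfg` might look like the following (hypothetical backend names and addresses; consult the HAProxy 1.8 documentation for production settings such as logging and health-check tuning):

```
defaults
    mode http
    timeout connect 5s
    timeout client  30s
    timeout server  30s

frontend web_in
    bind *:80
    default_backend web_servers

backend web_servers
    balance roundrobin
    server app1 10.0.0.11:8080 check
    server app2 10.0.0.12:8080 check
```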


IBMWebSphereProductFamilyonAzureOverview.png

IBM WebSphere Product Family on Azure Overview: The IBM WebSphere product family is a suite of enterprise Java application servers that enable enterprise Java workloads on Microsoft Azure. These servers run on Microsoft Azure Red Hat OpenShift, Azure Kubernetes Service, and VMs.


IntelligentDataPlatform.png

Intelligent Data Platform: Powered by Microsoft Azure, the EY Intelligent Data Platform is a scalable solution to optimize data in real time, generate rapid insights, enhance decision-making, and deliver greater business value. The platform supports risk management, regulatory reporting, governance, and more.


ioMoVo.png

ioMoVo: ioMoVo offers you a range of storage, data exchange, and multimedia management options for cloud or on-premises storage. This solution from Practical Solutions Inc. provides secure access to your data and lets you interconnect multiple storage platforms.


ioMoVoS.png

ioMoVoS: An add-in for the Practical Solutions Inc. ioMoVo platform, ioMoVoS provides media services such as video indexing, analysis of media with machine learning, publication to external video platforms, and more.


IoTAmbientConditionsIntelligentService.png

IoT Ambient Conditions Intelligent Service: IoT Ambient Conditions Intelligent Service helps data center operators, manufacturers, and plant operators improve their performance and reduce costs by improving the operational ambient conditions and reducing equipment maintenance.


JenkinsWithDebian10.png

Jenkins with Debian 10: Cognosys has pre-configured this ready-to-run image containing Jenkins 2.263.4 on Debian 10 Linux. Jenkins is a Java-based open-source tool providing continuous integration services for software development.


KeyScalerforAzureSphere.png

KeyScaler for Azure Sphere: Device Authority provides Sphere Security Automation powered by KeyScaler to enable end-to-end service offerings with enhanced security on Microsoft Azure Sphere.


LAMPWithDebian10.png

LAMP with Debian 10: Cognosys has pre-configured this ready-to-run image containing a LAMP (Linux, Apache, MySQL, PHP) stack on Debian 10 Linux. This image has been designed for enterprise customers who want to deploy a secure LAMP server. This image contains Apache HTTP Server 2.4.38, PHP 7.3, and MySQL Server 8.0.23.


MicrosoftTeamsVoIPCallingSolutions.png

Microsoft Teams VoIP Calling Solutions: Add a virtualDCS calling plan to extend your Microsoft Teams solution by enabling VoIP calling to non-Teams devices and telephones. virtualDCS offers a range of telephony services that integrate with Teams to meet your business requirements.


ModernWorkplace.png

Modern Workplace: The EY Modern Workplace services provide integrated and secure solutions for collaboration built on Microsoft 365, Windows 10, and enterprise mobility. With EY, you can be confident of having the right strategy, technology, capabilities, and governance to fuel and sustain your work.


MozzazDigitalHealthPlatformSaaS.png

Mozzaz Digital Health Platform (SaaS): Mozzaz is a digital health technology company that specializes in interactive solutions for remote patient monitoring, active engagement, and virtual telehealth. The Mozzaz platform provides over 200 digital solution libraries based on clinically proven interventions.


NetFoundryEdgeRouter.png

NetFoundry Edge Router: NetFoundry Edge Routers provide zero trust connectivity between Microsoft Azure and any site, edge device, private/public clouds, and hybrid applications. Create orchestrated networks delivered as a service to replace VPNs and SD-WAN.


Nextcloud-Theself-hostedproductivityplatform.png

Nextcloud – The self-hosted productivity platform: Linnovate offers this self-hosted instance of Nextcloud Flow, enabling users to quickly and securely share files and folders. Nextcloud Flow features file access control, encryption, authentication, and ransomware recovery capabilities.


OnlineCloudBackupforSharePoint.png

Online Cloud Backup for SharePoint: Back up SharePoint data through a browser-based portal with CloudCover 365 from virtualDCS. CloudCover 365 integrates with Veeam Backup 365 and Azure Active Directory.


PachydermEnterprise.png

Pachyderm Enterprise: Pachyderm is an enterprise-grade data science platform built on Kubernetes. Deploy a Pachyderm cluster on Microsoft Azure and deploy automated machine learning workflows at scale.


PCGAnalytics.png

PCG Analytics: This service enables strategic decision-making and reporting for stakeholders inside and outside of a university. Built on Microsoft Power BI, PCG Analytics integrates with external data sources, provides role-based dashboards, and delivers comprehensive data analysis for non-technical users.


ProjecttoPlannerSync-SaaS.png

Project to Planner Sync – SaaS: PPM Works’ Microsoft Project and Planner Sync enables two-way task synchronization between Microsoft Project Online and Microsoft Planner. Give your executives the visibility they seek with this powerful tool.


PublicFinanceManager.png

Public Finance Manager: Public Finance Manager (PFM) is a blockchain solution that addresses long-standing issues challenging public finance management. PFM integrates with existing ERP systems and facilitates viewing and reconciliation of appropriation and management frameworks.


Python3withDebian10.png

Python 3 with Debian 10: Cognosys has pre-configured this ready-to-run image containing Python 3.7.3 on Debian 10 Linux. Python is an open-source programming language with support for object-oriented programming, dynamic typing, and dynamic binding.
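The dynamic typing and dynamic binding mentioned above can be illustrated with a short, self-contained snippet (generic Python, not specific to the Cognosys image):

```python
# Dynamic typing: a name can be rebound to values of different types.
value = 42
print(type(value).__name__)   # int
value = "forty-two"
print(type(value).__name__)   # str

# Dynamic binding: the method that runs is resolved at call time,
# based on the object's actual type (duck typing).
class Dog:
    def speak(self):
        return "woof"

class Cat:
    def speak(self):
        return "meow"

for animal in (Dog(), Cat()):
    print(animal.speak())     # woof, then meow
```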


QStockWarehouseManagementandOrderManagement.png

QStock Warehouse Management & Order Management: The QStock warehouse management solution runs on Microsoft Azure and integrates in real time with Sage Intacct. QStock offers inventory control, integrated shipping, lot and serial tracking, e-commerce support, commercial invoices, and more.


Restaurantintra.png

Restaurantintra: Restaurantintra is a SaaS-based sales reporting solution for restaurants. The software provides mobile-friendly interactions, support for multiple restaurants, sales analysis, reporting, and budgeting. This software is available in Finnish and English.


RiskIntegrityIFRS17.png

RiskIntegrity IFRS 17: RiskIntegrity helps insurers of any size transition from legacy accounting frameworks to the IFRS 17 standard. The solution integrates with existing infrastructure and supports credit insurers, reinsurers, life insurers, and non-life insurers.


RiskIntegrityLDTI.png

RiskIntegrity LDTI: RiskIntegrity helps insurers of any size transition from legacy accounting frameworks to the Long-Duration Targeted Improvements (LDTI) accounting requirements. The solution integrates with existing infrastructure and supports credit insurers, reinsurers, life insurers, and non-life insurers.


RockyDEM44.png

Rocky DEM 4.4: CrunchYard’s Rocky DEM 4.4 System is a Microsoft Azure-based VM that provides a suitable environment for users to run Rocky DEM simulations with single or multiple Nvidia GPUs. Rocky is installed and configured on the chosen VM along with Nvidia CUDA drivers.


SimplificaCI.png

SimplificaCI: The SimplificaCI platform helps organizations facilitate internal communications across multiple channels, making your company more productive and profitable. The solution integrates with desktop, mobile, calendar, and email communications. This solution is available only in Portuguese.


SkyHiveEnterprise.png

SkyHive Enterprise: SkyHive Enterprise drives rapid workforce transformation by delivering real-time, skill-level insights into internal workforces and external labor markets, identifying future skills, and facilitating individual- and company-level reskilling.


UnionBenefitandProjectTimesheetTracker.png

Union Benefit and Project Timesheet Tracker: Simplify your union payroll with the Data Pros Timesheet app, built on Microsoft SharePoint and the Microsoft Power Platform. This automation software integrates with popular payroll systems and calculates union benefit payments, insurance, USL&H, and more.


UtilityWave.png

UtilityWave: UtilityWave delivers the required capabilities to tackle the challenges of multiple legacy systems, IoT devices, and a dynamic energy grid. UtilityWave utilizes Microsoft Azure to provide a scalable platform on which utilities can build digital energy services.


VeritasAPTAREITAnalytics.png

Veritas APTARE IT Analytics: Quickly deploy Veritas APTARE IT Analytics for reporting insights into your hybrid cloud storage environment. This BYOL version provides the visibility enterprises need to identify underutilized IT resources they can repurpose to achieve significant cost savings.


VolunteerManagementSystem.png

Volunteer Management System: Web Synergies’ iVolunteer is an end-to-end volunteer management system that is designed to help not-for-profit organizations increase efficiency, reduce costs, expand community outreach, and enable effective fundraising.


WordpressWithDebian10.png

WordPress with Debian 10: Cognosys has pre-configured this ready-to-run image featuring WordPress 5.6.2 on Debian 10 Linux. WordPress is an open-source CMS that provides a templating system for content publication. This image includes MySQL Server 8.0.23, Apache HTTP Server 2.4.38, and PHP 7.3.



 


Consulting services


 


1-DaySmartMaintenanceEnvisioningWorkshop.png

1-Day Smart Maintenance Envisioning Workshop: HSO will guide you on the journey from preventive maintenance to predictive maintenance by using Microsoft Azure AI. After reviewing your business objectives, HSO consultants will brainstorm solutions to define the strategy needed to drive your desired business outcomes.


AdvancedAnalyticsDiscovery10-WeekWorkshop.png

Advanced Analytics Discovery: 10-Week Workshop: The Advanced Analytics Discovery program from Peak Indicators will architect and deliver a blueprint for your organization to deploy a solution on Microsoft Azure using services such as Azure Machine Learning, Azure Databricks, and Azure Synapse Analytics.


AIandAdvancedAnalyticsServices10-WeekProofofConcept.png

AI & Advanced Analytics Services: 10-Week Proof of Concept: Tiger Analytics will help you drive planning and optimization of brand investments to improve sales, customer acquisition, customer insights, product analytics, and more. The data engineering service includes the design and development of an ETL pipeline using Azure Machine Learning services.


AzureAdvancedAnalytics10-WeekImplementation.png

Azure Advanced Analytics: 10-Week Implementation: Peak Indicators will work closely with your data science teams to deliver a pilot analytics solution built on Microsoft Azure. The engagement will focus on a use case defined with your stakeholders, development of a solution, and deployment of data science experiments and models.


AzureAppModernization2-WeekImplementation.png

Azure App Modernization: 2-Week Implementation: Softlanding’s engagement covers the benefits of Microsoft Azure and highlights Azure services that will help you modernize your applications. This offer includes guidance and deployment assistance for your developers to update an application to use Azure.


AzureApplicationMigration1-WeekAssessment.png

Azure Application Migration: 1-Week Assessment: PetaBytz’s cloud migration team will help your business get started using Microsoft Azure or optimize your current implementation. The service includes guidance on infrastructure, migration strategy for apps, and a high-level roadmap for migration planning.


AzureAutomation4-HourAssessment.png

Azure Automation: 4-Hour Assessment: In this free assessment, akquinet AG will explore the possibilities for you to automate tasks using automation tools on Microsoft Azure. This service is available for either an existing Azure tenant or a planned environment.


AzureMigration10-WeekImplementation.png

Azure Migration: 10-Week Implementation: Cybercom Group’s Cloud Migration Practice will onboard you and your applications on Microsoft Azure to enable further growth. Cybercom will migrate and modernize your digital estate.


AzureSentinel2-WeekImplementationandMaintenance.png

Azure Sentinel: 2-Week Implementation & Maintenance: Softlanding will provide you with a high-level view of your security infrastructure by deploying Microsoft Azure Sentinel, hardening your Microsoft 365 environment, and configuring baseline security reports.


AzureSynapseAnalytics5-DayImplementation.png

Azure Synapse Analytics: 5-Day Implementation: Softlanding will provide you with a strong foundation to analyze big data using Microsoft Azure Synapse Analytics and create reports built on Microsoft Power BI. This service includes data ingestion, design of data lake and data warehouse, and data cleansing.


AzureWindowsVirtualDesktop6-WeekProofofConcept.png

Azure Windows Virtual Desktop: 6-Week Proof of Concept: Stay ahead of the curve by utilizing Practical Solutions Inc.’s professional services to quickly unlock the full scope of Windows Virtual Desktop on Microsoft Azure. Practical Solutions will develop a conceptual proof of concept and deliver a roadmap for deployment.


BuildUpwithAzure-AssessmentandPropositions5-Day.png

Build Up with Azure: 5-Day Assessment & Propositions: Indacon offers a remote engagement to build up or integrate your solutions on Microsoft Azure. Indacon will identify how you can migrate or optimize environments and will define a roadmap to provide you with immediate benefits in cost, performance, and security.


CloudAdoptionFramework6-WeekImplementation.png

Cloud Adoption Framework: 6-Week Implementation: Practical Solutions Inc. (PSI) will highlight the best practices, key value, and benefits of Microsoft Azure cloud services. PSI will walk you through the Microsoft Cloud Adoption Framework, guide you through adoption, and identify key cost-saving opportunities.


CloudServicesforAzureLighthouse.png

Cloud Services for Azure Lighthouse: Practical Solutions Inc. (PSI) will support your Azure-based cloud services using Microsoft Azure Lighthouse. With Azure Lighthouse, you maintain control of your Azure tenant while PSI has the access required to support you.


ContainerswithOpenShiftonAzureImplementation.png

Containers with OpenShift on Azure: Implementation: Uni Systems provides consulting and assistance as you transition to a container-based architecture for DevOps using Red Hat OpenShift on Microsoft Azure. The engagement includes assistance in establishing DevOps practices, configuring CI/CD pipelines, cluster optimization, and more.


DataGovernance10-WeekImplementation.png

Data Governance: 10-Week Implementation: Exelegent will implement security and information governance capabilities in your healthcare organization by using Microsoft Azure Information Protection, cybersecurity frameworks, and industry best practices.


GitHubandAzureDevOps2-DayWorkshop.png

GitHub and Azure DevOps: 2-Day Workshop: Brainscale will highlight features of GitHub and Microsoft Azure DevOps to help participants decide which developer collaboration platform suits their needs. This workshop includes an overview of DevOps fundamentals and industry practices, as well as guidance on migrating from older source control platforms.


MigratetoAzure4-WeekImplementation.png

Migrate to Azure: 4-Week Implementation: Foghorn Consulting experts will help you migrate to Microsoft Azure and manage your cloud operations. Foghorn provides expertise in cloud engineering, site reliability, performance optimization, and other services to improve your ROI and accelerate DevOps efforts.


MOQdigitalAzureMigration2-WeekImplementation.png

MOQdigital Azure Migration: 2-Week Implementation: MOQdigital will migrate your virtual machines to Microsoft Azure IaaS. This service is aimed at customers who want to migrate workloads in a secure manner and establish a repeatable process for server migration using Microsoft best practices.


MphasisEONQuantumComputing5-DayAssessment.png

Mphasis EON Quantum Computing: 5-Day Assessment: Mphasis’s assessment helps enterprises perform a structured analysis to determine whether quantum computing is a relevant approach for solving a specific business problem. Mphasis will evaluate software, hardware, and algorithm requirements for you.


MphasisEONQuantumComputing5-DayWorkshop.png

Mphasis EON Quantum Computing: 5-Day Workshop: Mphasis’s hands-on workshop helps enterprises create a roadmap for using quantum computing to solve business problems in machine learning, optimization, and simulation.


MphasisEONQuantumComputing6-WeekProofofConcept.png

Mphasis EON Quantum Computing: 6-Week Proof of Concept: Mphasis will create a proof of concept to establish a business case for a quantum computing solution to solve your critical business problem. This offer is led by Mphasis’s team of experts in quantum computing, data science, and Microsoft Azure.


SmartMeterAnalytics8-WeekImplementation.png

Smart Meter Analytics: 8-Week Implementation: Neudesic will process, validate, and prepare smart meter data for visualization and analysis on a hybrid cloud architecture that utilizes on-premises Microsoft SQL Server and Microsoft Power BI with Microsoft Azure HDInsight.


VOIPNETWORKSCLOUD9PROMO.png

VoIP Networks Cloud9 Promotion: VoIP Networks will act as your one-stop vendor for all facets of your telephony and networking technologies. This offer includes a central point of contact for all common carriers to maintain existing services or coordinate activation of new ones.