Hot rental car market = scams

This article was originally posted by the FTC. See the original article here.

Rental car scams: Renting a car this summer? Only scammers will ask you to pay with a gift card. #Scam Free Summer. ReportFraud.ftc.gov.

The weather is getting warmer, and you might be itching to travel again. The mountains, the beach, and the trails are calling you — and everyone else. At least that’s what it feels like when you start looking into renting a car. With rental car availability at an all-time low, prices are sky high. So, if you suddenly find an available car at a cheap price, you might be dealing with scammers looking to cash in on the rental car shortage.

Scammers are posing as rental car companies, setting up their own websites, and advertising fake customer service phone numbers, all to convince travelers they’re legit. Then, they’re asking people to pre-pay for the rental — with a gift card or prepaid debit card. To avoid rental car scammers driving off with your money:

  • Research the rental car company by searching for the name of the company and words like “scam,” “complaint,” or “review” to check if other people have had a bad experience.
  • Verify deals with the company directly. If you need customer support, look for contact info on the company’s official website. Don’t use a search engine result. Scammers can pay to place sponsored ads in search results, so they show up at the top or in the sponsored ad section.
  • Pay with a credit card if possible, and never pay with a gift card or prepaid debit card. You can dispute credit card charges, but gift cards and prepaid debit cards can disappear like cash. Once you give the number and PIN to a scammer, the money is gone.

Before you rush to book that miraculously available rental car, take a beat and read up about things you should consider when renting a car. If you spot a rental car scam, tell the FTC at ReportFraud.ftc.gov.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Microsoft Teams certified accessory tour

This article is contributed. See the original author and article here.

Today we’re pleased to announce the availability of the latest Microsoft hardware innovations in support of remote and hybrid work at home and in the office. In the latest episode of Microsoft Mechanics, engineering leader Branden Powell explains how the Surface design team built devices for all-day comfort and mobility, delivering the best experience on Microsoft Teams.


 


mechanics-episode.png


 


Devices now available at your favorite electronics retailer and surface.com include:


 



  • Surface Headphones 2+ for Business

  • Microsoft Wireless Headset

  • Microsoft Modern USB Headset

  • Microsoft Modern USB-C Speaker

  • Microsoft Modern Webcam


 


To learn more, see detailed specs at aka.ms/ModernAccessories.

How Microsoft Viva Can Enhance Your Employee Experience – Webinar

This article is contributed. See the original author and article here.

Vivawebinar.png

Microsoft Viva is an employee experience platform that brings together communications, knowledge, learning, resources, and insights. Powered by Microsoft 365 and experienced primarily through Microsoft Teams, Viva fosters a culture where people and teams are empowered to be their best from anywhere.


 


*  Join us Thursday, June 17th, to learn how Microsoft Viva can take your organization’s employee experience to the next level as we cover the topics below. Download the meeting invite here.


 



  • Viva Connections – Communications and culture
    • Keep everyone connected: Encourage meaningful connections across the organization by enabling employees to easily discover relevant communications and communities.
    • Make it easy for people to contribute: Foster a culture of inclusion by empowering every employee to contribute ideas and share feedback.
    • Unite and inspire your organization: Align the entire organization around your vision, mission, and strategic priorities.
  • Viva Topics – Knowledge and expertise
    • Turn content into usable knowledge: Use AI to reason over your organization’s content and automatically identify, process, and organize it into easily accessible knowledge.
    • Organize knowledge into topic pages: Enable your organization’s experts to share and refine knowledge through curated topic pages, automatically generated and updated by AI.
    • Make knowledge easy to discover and use: Deliver relevant topic cards in the apps people use every day.
  • Viva Learning – Skilling and growth
    • Make learning a natural part of your day: Foster a culture of learning by enabling people to easily discover, share, and engage with learning integrated into Microsoft Teams.
    • Make all your learning content available in one place: Simplify the learning experience by bringing together world-class content from LinkedIn Learning, third parties, Microsoft Learn, and your own content.
    • Drive results that matter: Empower your leaders and employees to curate, assign, and track learning aligned with business outcomes.
  • Viva Insights – Productivity and wellbeing
    • Deliver personalized and actionable insights: Empower individuals, teams, and orgs to achieve balance, build better work habits, and improve business outcomes with personalized insights and recommended actions.
    • Quantify the impact of work on people and business: Gain data-driven, privacy-protected visibility into how work patterns affect wellbeing, productivity, and results.
    • Address complex business challenges: Use advanced tools and additional data sources to perform deeper analysis, address challenges important to your business, and respond quickly to change.
  • Wrap Up/Next Steps











 


Thanks for visiting – Michael Gannotti   LinkedIn | Twitter 


Michael Gannotti

Microsoft 365 PnP Weekly – Episode 131

This article is contributed. See the original author and article here.

Thumb-Ep131.png


 


 


In this installment of the weekly discussion revolving around the latest news and topics on Microsoft 365, hosts Vesa Juvonen (Microsoft) | @vesajuvonen and Waldek Mastykarz (Microsoft) | @waldekm are joined by Zhenya Savchenko (Microsoft), a Senior Program Manager in the Developer Division’s Visual Studio group. He is one of the PMs coordinating the new Microsoft Teams Toolkit v2 extension for Visual Studio Code. Topics discussed in this session: the Visual Studio developer tools group’s role in the development of the Microsoft Teams Toolkit, ideas for spending $5B to make Microsoft products better for developers, next steps for Teams Toolkit, and a wrap-up with each participant’s focus for the week.


 


Also covering 22 new articles from Microsoft and the community from the past week!


 


Please remember to keep on providing us feedback on how we can help on this journey. We always welcome feedback on making the community more inclusive and diverse.


 


 


This episode was recorded on Monday, June 14, 2021.


 



 


These videos and podcasts are published each week and are intended to be roughly 45 – 60 minutes in length.  Please do give us feedback on this video and podcast series and also do let us know if you have done something cool/useful so that we can cover that in the next weekly summary! The easiest way to let us know is to share your work on Twitter and add the hashtag #PnPWeekly. We are always on the lookout for refreshingly new content. “Sharing is caring!” 


 


Here are all the links and people mentioned in this recording. Thanks, everyone for your contributions to the community!


 


Microsoft articles:


 



 


Community articles:


 



 


Additional resources:


 



 


If you’d like to hear from a specific community member in an upcoming recording and/or have specific questions for Microsoft 365 engineering or visitors – please let us know. We will do our best to address your requests or questions.


 


“Sharing is caring!”

How to use Send an HTTP request to SharePoint in Power Automate?

This article is contributed. See the original author and article here.

Introduction


 


The Send an HTTP request to SharePoint action is used to execute SharePoint REST queries. Whenever we want to perform operations in SharePoint we use its REST APIs, and in a flow we can use this action for the same purpose.


 


For more details refer to this.


 


Implementation


 


We will create a SharePoint list and perform Create, Read, Update, and Delete operations using an instant flow. Let’s see the step-by-step implementation.


 


 


1. Go to Power Automate > My flows > Click on New flow > Select instant cloud flow


 


STep 1.png


 


 


 


2. Read items from To Do list. (Read Operation)


 


So to perform the read operation we will use the GET method.


 


Add the Send an HTTP request to SharePoint action from the + icon; here we need the Site Address, Method, and URI.


 


Site Address: Select the Site URL in which we want to perform actions


Method: GET


Uri: _api/web/lists/getbytitle('To Do')/items


Headers: Not required


 


You can also put the URI in a variable, because we will need it for all the actions.


 


Read Items.png


 


If it executes successfully, it returns status code 200 and, if records are available, the records in the body.


 


Read Items OP.png


 


 


3. Create item in To Do list (Create Operation)


 


Site Address: Select the Site URL in which we want to perform actions


Method: POST


Uri: _api/web/lists/getbytitle('To Do')/items


Headers: Need JSON object


  {
      "content-type": "application/json;odata=verbose",
      "accept": "application/json;odata=verbose"
  }


Body : RequestBody


  {
      "__metadata": {
          "type": "SP.Data.To_x0020_DoListItem"
      },
      "Title": "Demo Task1",
      "Status": "Started"
  }


 


In the request body we need the type. How do we get it? It is SP.Data.{ListName}ListItem, where {ListName} is the list name; if the list name contains a space, the space is encoded as _x0020_.
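The naming rule can be sketched as a small Python helper (a simplified illustration of my own, not an official API; SharePoint applies the same _xNNNN_ encoding to other special characters as well, which this sketch ignores):

```python
def sp_listitem_type(list_title: str) -> str:
    """Build the SP.Data entity type name for a SharePoint list.

    Simplified: only spaces are encoded as _x0020_; SharePoint encodes
    other special characters with the same _xNNNN_ scheme as well.
    """
    name = list_title[0].upper() + list_title[1:]
    return "SP.Data." + name.replace(" ", "_x0020_") + "ListItem"

print(sp_listitem_type("To Do"))  # SP.Data.To_x0020_DoListItem
```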


 


Create item.png


 


If it executes successfully, it returns status code 201.


 


Create item op.png


 


 


 


4. Update item in To Do list (Update Operation)


 


Site Address: Select the Site URL in which we want to perform actions


Method: PATCH


Uri: _api/web/lists/getbytitle('To Do')/items(itemID)


Headers


  {
      "content-type": "application/json;odata=verbose",
      "IF-MATCH": "*"
  }


Body : RequestBody


  {
      "__metadata": {
          "type": "SP.Data.To_x0020_DoListItem"
      },
      "Title": "Task1"
  }


  


Update Item.png


 


 


5. Delete item in To Do list (Delete Operation)


 


Site Address: Select the Site URL in which we want to perform actions


Method: DELETE


Uri: _api/web/lists/getbytitle('To Do')/items(itemID)


Headers


  {
      "content-type": "application/json;odata=verbose",
      "IF-MATCH": "*",
      "X-HTTP-Method": "DELETE"
  }


Body: Not required 


 


Delete ietm.png
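For quick reference, the four action configurations above can be captured as plain data (an illustrative sketch only; itemID stands in for a real item id, and only the status codes mentioned above are listed):

```python
# Summary of the four "Send an HTTP request to SharePoint" configurations
# used in this walkthrough. Illustrative data only; itemID is a placeholder.
LIST_URI = "_api/web/lists/getbytitle('To Do')/items"

CRUD_REQUESTS = {
    "read":   {"method": "GET",    "uri": LIST_URI, "status": 200},
    "create": {"method": "POST",   "uri": LIST_URI, "status": 201},
    "update": {"method": "PATCH",  "uri": LIST_URI + "(itemID)"},
    "delete": {"method": "DELETE", "uri": LIST_URI + "(itemID)"},
}

for operation, config in CRUD_REQUESTS.items():
    print(operation, config["method"], config["uri"])
```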


Summary


 


In this article, we have seen the step-by-step implementation of CRUD operations on SharePoint list items using the Send an HTTP request to SharePoint action in Power Automate.


 


Hope this helps!


 


Sharing is caring!


 

Run reports on your tenant using the CLI for Microsoft 365 and Azure Container Instances

This article is contributed. See the original author and article here.

Using the CLI for Microsoft 365, you can manage your Microsoft 365 tenant and SharePoint Framework projects on any platform, including running it in a container. By running the CLI for Microsoft 365 in a container you free yourself from installing it locally. Leveraging a Logic App, Azure Container Instances and the CLI for Microsoft 365 you can automate reporting and maintenance of your tenant.


 


The goal is thus to generate some form of reporting on our tenant. There are a lot of sample scripts available, but we will start simple by reporting all deleted Office 365 Groups (this will include deleted Teams). Executing a report like that with the CLI for Microsoft 365 would be just a single line:  


 

m365 aad o365group recyclebinitem list

 


To achieve our reporting scenario we need a few other components as well:



  1. An Azure resource group to group our resources together;

  2. A managed identity to make sure we can access our tenant securely;

  3. A Logic App to orchestrate everything;

  4. Azure Container Instances to execute our script;

  5. A Git repo that contains the script we want to execute and pull in.


If you are new to running Azure Containers and using Logic Apps there is a great walkthrough on running sentiment analysis with Azure Container Instances. But let us get started with the first step.


Create an Azure Resource Group


To create a new Azure resource group simply log in to the Azure Portal and create a new resource group using the create a resource button. For demo purposes I called my resource group cli-test.


Create a Managed Identity


In our resource group we can create a new user managed identity that we can use. Click on the add button and search for User Assigned Managed Identity to create the new managed identity. Simply provide a name and create your new managed identity. 


 


cli-ma.png


 


 


 


Once the managed identity is created, it still needs permissions; just creating it is not enough. The CLI for Microsoft 365 docs explain in detail which steps are required. For our reporting scenarios we need support for querying the Graph and SharePoint. You can use the CLI for Microsoft 365 to easily grant the permissions you need with the following snippet:


 

m365 aad approleassignment add --displayName "ma-cli-test" --resource "Microsoft Graph" --scope "Sites.Read.All"
m365 aad approleassignment add --displayName "ma-cli-test" --resource "Office 365 SharePoint Online" --scope "Sites.Manage.All"
m365 aad approleassignment add --displayName "ma-cli-test" --resource "Microsoft Graph" --scope "Group.Read.All" 

 


Now all we need is the GUID that identifies the managed identity. Navigate to https://portal.azure.com select Azure Active Directory and pick Enterprise Applications. In the Application Filter filter on Managed Identities and make sure to pick the Application Id of your newly created managed identity. 


 


cli-ma-id.png


Store it somewhere safe, as we will need it later.


Create a Logic App


As we need a Logic App to orchestrate the execution of our script, the first step is to create one. You can do so in any resource group, or create a new one. There are no special requirements for your Logic App, so we pick a Logic App (Consumption) and provide a resource group and name. Once the Logic App is created, navigate to it and click the edit button. Once in edit mode you are free to pick your trigger: you can use any trigger you like; for demo purposes you could connect it to a list in SharePoint Online, or just use an HTTP trigger. The next step is to create two parameters, containerGroup and containerName. Both will be used more than once, so it helps to create them as parameters, but it is not required. For demo purposes I added a small snippet as a postfix to identify when the container was created:

formatDateTime(utcNow(),'MM-dd-yyyy')

 This helps when debugging, as you can always identify when something was created. The next step is to add a new action of type Create or update a container group. It is a pretty complex action with a lot of parameters to get right, so it is worth paying close attention when typing:

  • Subscription Id: any subscription you like (I suggest putting it all in the same subscription).
  • Resource Group: any resource group you like (I suggest putting it all in the same resource group).
  • Container Group Name: @{variables('containerGroup')} (parameter for ease of use).
  • Container Group Location: any location is supported.
  • Container Name – 1: @{variables('containerName')} (parameter for ease of use).
  • Container Image – 1: m365pnp/cli-microsoft365:latest
  • Container Resource Requests CPU – 1: 1 (if you are planning to run complex or large scripts, it might be worth adding additional resources).
  • Container Resource Requests Memory – 1: 1.5 (same consideration as for CPU).
  • Container Command Segment – 1: bash (you cannot execute multiple commands in a single line; you will need to add each statement to a segment).
  • Container Command Segment – 2: /mnt/repo1/test.sh (the values of the mount path and script name depend on the settings below).
  • Container Volume Mount Path – 1: /mnt/repo1
  • Container Volume Mount Name – 1: gitrepo (the name for the mount path must match the volume name specified later).
  • Volume Git Repo Volume Repository – 1: https://github.com/appieschot/cli-test.git (any public repository will do).
  • Volume Name – 1: gitrepo
  • ContainerGroup Managed Identity User Assigned Identities: { "/subscriptions/[guid]/resourcegroups/[rg-name]/providers/microsoft.managedidentity/userassignedidentities/[identity-name]": {} } (make sure to update the bracketed placeholders).
  • ContainerGroup Restart Policy: OnFailure (makes sure the container is restarted if there are problems).

This action thus creates a new container instance, pulls in the CLI for Microsoft 365 image, and makes sure a Git repo with the script is mounted. And best of all: the container will run with the managed identity you specified. Now all we need to do is capture the output of our container instance and clean up.


 


To do so, add an action of type Until. Do not configure any conditions yet; simply add an action of type Get properties of a container to retrieve the latest status, using the containerGroup parameter to make sure you get the correct container. Put in the container group name as a parameter and check for the status Succeeded in the Until action. Then add a condition and check for the same status. In the true branch we can implement the logic for retrieving our data: use Get logs from a container instance and e-mail the output to yourself.


cli-la-cleanup.png


In the false branch, add a wait action that waits for 10 or 15 seconds. That ensures the flow waits a while before it rechecks whether the container has finished.
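The Until/condition/wait combination is a plain polling loop. As a rough illustration (not Logic App code; get_status is a hypothetical stand-in for the Get properties of a container action):

```python
import time

def wait_for_container(get_status, interval=15, timeout=600):
    """Poll until the container group reports 'Succeeded'.

    get_status is a hypothetical callable returning the container state;
    interval mirrors the 10-15 second wait in the false branch.
    """
    elapsed = 0
    while elapsed <= timeout:
        if get_status() == "Succeeded":
            return True        # true branch: go fetch the container logs
        time.sleep(interval)   # false branch: wait before checking again
        elapsed += interval
    return False               # give up after the timeout
```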


Create your script


The last part is to make sure that the test.sh file, referred to above as being mounted in the container, actually exists. It should contain all the logic to run your script, and the first step is to authenticate. To authenticate with the managed identity we need to tell the CLI for Microsoft 365 to do so: by specifying login with the authType parameter set to identity, we can pass the GUID of our managed identity. The CLI then knows to use the managed identity of the container it is running in, and you are done.


 

m365 login --authType identity --userName 00000000-0000-0000-0000-000000000000 
m365 aad o365group recyclebinitem list --output json

 


Full Logic App


Once all these components are in place you can run any script in the container. Just make sure you have the appropriate permissions assigned to the managed identity!


PnP Recording


If you are looking for a video walkthrough, you can check out the PnP recording. It highlights the results and the steps taken to achieve basic reporting on your tenant.




Lesson Learned #174: Using Synonyms in Azure SQL Managed Instance for Linked Server tables

This article is contributed. See the original author and article here.

Today, I worked on a very interesting case: our customer has a linked server to Azure SQL Database, but they want to avoid typing the linked server prefix every time they run a query against it, for example, SELECT * FROM MyLinkedServer.DatabaseName.SchemaName.TableName. In this article, I would like to share an example of how to do it.


 


The first thing that we need is to have a linked server connected to any other Managed Instance, OnPremises or Azure SQL Database. Let’s give the name LinkedServerMI to this linked server. 


 


Once we have this, basically, we need to create a new synonym using the following command: CREATE SYNONYM [dbo].[MyNewTableForLinkedServer] FOR [LinkedServerMI].[DatabaseName].[SchemaName].[TableName]


 


Using this synonym, every time I execute a command such as SELECT * FROM MyNewTableForLinkedServer, it automatically runs the query against the linked server’s database, schema, and table.


 


Enjoy!


 


 

Cloud App Security: block TOR Browser (Anonymous IP)

This article is contributed. See the original author and article here.

Hi all, Alan here again with a new article, I’m a Customer Engineer from Italy on Identity and Security. 


 


I want to show you how to block the TOR browser using Cloud App Security, simple and fast!


 


During the last few months, several customers have asked me how to block sign-ins from anonymous IP addresses; one example would be someone using the TOR Browser. I started playing around with CAS and finally found a quick solution. Continue reading to find out more.


 


We will use Azure AD “Conditional Access policy” with Session Control together with “Cloud App Security Conditional Access App Control”. 


 


We will start by creating a “special” Azure AD Conditional Access policy that will enable the APP in CAS; let’s see how.


PS: the APP will appear in Cloud App Security Conditional Access App Control directly after the user starts authenticating/using it.


 


You will need access to your tenant’s Azure AD (portal.azure.com) and Cloud App Security (mycompany.portal.cloudappsecurity.com).


 


The first thing to do is create an Azure AD Conditional Access policy:


 


1. Navigate to your Azure Active Directory 


2. Under Manage click on Security 


Immagine1.png


Immagine3.png


3. Click on Conditional Access 


4. Select New Policy 


Immagine4.png


5. Give it a Name 


6. Select to which users it will apply 


7. Select the cloud application, for this demo I will select Office 365 


Immagine5.png


8. Go to Session and select Use Conditional Access App Control 


9. Select Use Custom Policy  


Immagine6.png


10. Click Select 


11. Enable the policy and click Create 


Immagine7.png


 


Once this is done, the first time users log in to the Office 365 suite the application will be integrated into Cloud App Security.


 


Open the Cloud App Security portal: https://mycompany.portal.cloudappsecurity.com


 


On the top right side you have the configuration wheel, click and select “IP Address ranges” as shown below 


Immagine8.png


One interesting thing is that if you filter for one of the following tags (Tor, Anonymous proxy, or Botnet), you will see it matches the following rule:


Immagine9.png


 


CAS has the “intelligence” to know which suspicious IP addresses and networks these are.


 


Here are some more details: Create anomaly detection policies in Cloud App Security | Microsoft Docs



  • Activity from anonymous IP addresses 

  • Activity from suspicious IP addresses, Botnet C&C 

  • Activity from a TOR IP address 


Back to our Connected Apps: 


 


1. Go to Connected Apps 


Immagine10.png


2. In the central pane you will have three tabs, select “Conditional Access App Control apps”. 


   You will get a list of applications for which you can start creating CAS policies 


Immagine11.png


 


3. Now browse to Control menu and select Policies 


Immagine13.png


4. Select “ + Create policy” 


Immagine14.png


Immagine15.png


 


The important part here is FILTERS and ACTIONS


Immagine16.png


5. Click on “Create” (you will see it listed)


Immagine17.png


 


Access Office portal from the TOR Browser (use a valid user account from your Azure AD)


Immagine18.png


Immagine19.png

GREAT!!


 


Hope this article gives you some hints on how to use Cloud App Security, which I think is a great tool: simple, powerful, and able to really help enhance your security posture.


 


Regards 


Alan @CE  


Customer Engineer – Microsoft Italy 


 


Useful Resources: 


Zero Trust Maturity Assessment Tool | Microsoft Security 


 

Update on geospatial functions

This article is contributed. See the original author and article here.

The geospatial functions in Azure Data Explorer (ADX) received a big update in June 2021. Based on our research and very valuable feedback from our customers we decided to add support for H3 Cell tokens. On top of that we provided new functionality to generate polygons from all of our geospatial clustering functions.


 


H3 Cell token


Geospatial data can be analyzed efficiently using grid systems to create geospatial clusters. You can use geospatial tools to aggregate, cluster, partition, reduce, join, and index geospatial data. These tools improve query runtime performance, reduce stored data size, and visualize aggregated geospatial data. H3 is Uber’s Hexagonal Hierarchical Spatial Index. More information on how to choose the right algorithm can be found in the documentation for geospatial grid systems.


 


Visualizing “Kusto” on a map: https://gist.github.com/cosh/2c22c1ccf673f2af1dc44aa764887c4b#file-map-geojson


 


geo_point_to_h3cell()


Calculates the H3 Cell token string value for a geographic location (doc). Example:


 

print h3cell = geo_point_to_h3cell(-74.04450446039874, 40.689250859314974, 6)

Output:

h3cell
862a1072fffffff


geo_h3cell_to_central_point()


Calculates the geospatial coordinates that represent the center of an H3 Cell (doc). Example:


 

print centralpoint = geo_h3cell_to_central_point("862a1072fffffff")

Output:

centralpoint
{
"type": "Point",
"coordinates": [-74.016008479792447, 40.7041679083504]
}

 


 


Geospatial cluster to polygon


This is something we wanted to add for some time: the capability to calculate the polygon that represents a grid cell’s area.


 


geo_geohash_to_polygon()


Calculates the polygon that represents the geohash rectangular area (doc).


Geohash to polygon: https://gist.github.com/cosh/e9b5d65b813b4fe4fb7b8d4bd6386287#file-map-geojson


 


geo_s2cell_to_polygon()


Calculates the polygon that represents the S2 Cell rectangular area (doc).


S2 to polygon: https://gist.github.com/cosh/acbc84f82441f3d41a2b73e8b26d9ae9#file-map-geojson


 


geo_h3cell_to_polygon()


Calculates the polygon that represents the H3 Cell area (doc).


H3 to polygon: https://gist.github.com/cosh/2c22c1ccf673f2af1dc44aa764887c4b#file-map-geojson