E-Documents as a Global Solution for Business Central


This article is contributed. See the original author and article here.

Business Central 2023 release wave 2 introduces a new global feature – Electronic Documents. Microsoft has crafted this as a foundational framework, providing a robust base for catering to localized requirements. This innovative approach allows Microsoft to efficiently deliver tailored localization apps for some countries. Furthermore, partners can leverage this model to craft their custom localizations. Given the unique e-document formats and distinct integration services prevalent in various countries, these localization apps are indispensable.


Why did Microsoft deliver it?

Before we delve into the world of E-Documents within Business Central, let's acquaint you with an important new acronym – CTC, which stands for Continuous Transaction Control. This term refers to the 'real-time' invoice reporting and validation mandated by tax authorities. CTC encompasses a suite of digital control mechanisms to enhance tax collection and curb tax fraud. Electronic Invoicing stands as a key component of CTC, a model increasingly embraced by numerous countries.

We can easily conclude – this approach will become mandatory in many countries if it hasn’t already. The prospect of different solutions for each country poses challenges. Hence, we’ve developed one global extendable solution as an app, simplifying support for diverse countries.

Exploring the Scope of E-Invoicing Models

Understanding the landscape of E-invoicing can initially seem straightforward – creating an electronic file and transmitting it. However, the reality is more intricate. E-invoicing encompasses various models, including the 2-corner, 3-corner, and 4-corner frameworks. Each country retains autonomy to determine its preferred approach. Moreover, even in non-mandatory scenarios, businesses often opt for E-invoicing to streamline communication.

Our solution encompasses all the corner models mentioned here and facilitates additional messaging capabilities between access points, providing a comprehensive E-documents framework.


How do E-Documents Operate in Sales and Purchases?

E-Documents in Business Central facilitate seamless interaction in both sales and purchase processes. It’s a two-way system, enabling the transmission of electronic sales documents to customers while also receiving electronic documents from vendors. These electronic documents have their own distinct lifecycles, which may not always align with invoice timelines. To accommodate this, we’ve introduced a new entity, the E-Document, linked with the original document in Business Central. This entity hosts a unique information set, including statuses, logs, and potential error notifications or warnings.

Once the system is configured, posting a sales document triggers the automatic creation of an E-Document. Depending on your setup, it’s promptly dispatched to the designated service. You gain complete visibility into its status and can take additional actions as needed.

In the case of incoming purchase electronic documents, you have the flexibility to upload them to Business Central manually. However, if your access-point provider offers a document delivery service, you can configure a Job Queue for automated downloads and E-Document creation. Here's the magic: if you've mapped vendors' items to yours through item references or G/L accounts and there are no errors, the system will effortlessly generate purchase invoices with all the essential details. Your task? Just review and post them.

How to Expand This Functionality?

The framework is designed to improve your productivity in developing electronic invoicing applications. It handles essential infrastructure tasks such as subscribing to various posting routines, custom mapping logic, managing logs, handling errors, and running background jobs. This empowers you to direct your attention towards the specific electronic invoicing logic, including:

  • Exporting/Importing documents from Business Central to the local format mandated by the authority.
  • Establishing seamless integration with the authority’s endpoint for sending and receiving electronic documents.

To create your local E-Document:

  1. Create a new extension with a dependency on the E-Document Core application.
  2. Implement a document interface based on the specification mandated by the local authority, using the designated sales endpoints Check, Create, and CreateBatch, plus GetBasicInfo and PrepareDocument when you expect to receive documents.
  3. Implement an integration interface to send/receive documents to the local authority automatically.
  4. Enhance user experience by implementing a setup wizard that gathers all necessary configuration details and obtains customer consent for data transmission. This streamlined process ensures a smoother onboarding experience for users.

More details, with examples of how to extend the existing E-Document Core application, can be found here.

The post E-Documents as a Global Solution for Business Central appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Solve case creation issues using the enhanced activity monitor tool 


This article is contributed. See the original author and article here.

Digital contact centers need to create support cases automatically when they receive incoming emails, phone calls, and messages from customers. They rely on this automation to avoid manual efforts in creating cases so they can address customer issues promptly. The activity monitor view helps diagnose automatic case creation issues and ensures this automation runs smoothly. 

Today, Dynamics 365 Customer Service offers automatic record creation rules that create support cases based on conditions defined by administrators. However, an incoming work item may not automatically convert to a case for multiple reasons. The activity monitor tool helps administrators diagnose those issues and provides the reason a work item was not converted to a case. Once they know the reason, administrators also need the system to suggest how to resolve the issues. Recent enhancements to the activity monitor tool provide administrators with recommendations and steps to avoid future case creation issues by changing the rule configuration.

Now, administrators can view recommendations for each activity monitor event to see why case creation was skipped or failed and steps to resolve the issue. The activity monitor form for each event contains the resolution steps and direct links to the rule settings that they should change. This helps administrators diagnose and self-solve their issues quickly.  

Navigate to the activity monitor

In the Customer Service admin center, administrators can view the status of events for the past 7 days. They can see skipped, failed, and successfully processed events from automatic record creation rules with the status Ready for Power Automate.

From here, administrators can navigate to the Activity monitor events for last 7 days view, a grid of the events processed in the past 7 days with details such as current state, rule name, and condition. Additionally, it contains the reasons and recommendations to resolve the issues if the state was skipped or failed. Admins can also view the Recommendations column in the existing All activity monitor events view.

View the form for a specific activity

Administrators can double-click on any part of the event record to navigate to the Activity monitor form. The form shows the Actions section with the Recommendations field. This field explains why case creation was skipped or failed and the steps to take to resolve these issues. Some contain direct links to the Advanced settings of the relevant automatic record creation rule with suggestions for configuration changes. Administrators can directly navigate to these settings and make the required changes. Once they make the changes, any future work items that use automatic record creation rules will convert to cases. Note that the changes will not impact the work items sent to Dynamics before the admin changed the configuration.  

Learn more

Watch a quick video introduction.

Read the documentation: Manage activity monitor to review and track rules | Microsoft Learn 

The post Solve case creation issues using the enhanced activity monitor tool  appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Welcome to the 2024 Imagine Cup


This article is contributed. See the original author and article here.



 


Welcome to the 2024 Imagine Cup, a student competition for visionary entrepreneurs building with AI. If you’re passionate about AI, technology, and entrepreneurship, this is your opportunity to unlock your startup’s potential and shine on the global stage. Here’s what you can expect on this exciting journey.




 


Benefits of participating in Imagine Cup



  • Access to AI Technology: Immerse yourself in cutting-edge AI technology with USD1,000 of Azure credits and USD2,500 of OpenAI credits. You’ll receive expert guidance to accelerate your startup’s AI journey, giving you a distinct advantage in crafting AI-driven solutions.


  • Expert Mentorship: Imagine Cup goes beyond mere technical guidance; it offers a comprehensive entrepreneurial ecosystem. Through personalized one-to-one guidance from technical and business mentors tailored to your specific needs as a founder, you’ll acquire the essential tools to hone your strategies and successfully navigate the challenges along your startup journey. From helping you leverage Microsoft’s industry-leading AI capabilities for your startup, to inspiring you with what is possible as an entrepreneur, your mentors will be trusted partners on your path to success.


 



  • Prizes and Global Recognition: Seize the opportunity to showcase your startup on a global stage at Microsoft Build! The Imagine Cup champion team will receive USD100,000, while the two runner-up teams will each earn USD50,000. Plus, the winning team secures an exclusive mentorship session with Microsoft Chairman and CEO, Satya Nadella!



Stages of the Competition


Now, let’s explore the stages of the 2024 Imagine Cup competition:


1. Qualifying: Build the Future with AI (Oct 16 – Jan 24)


Kickstart your Imagine Cup journey by submitting your innovative AI startup idea. Participants in this stage will gain access to Microsoft for Startups Founders Hub, USD1,000 in Azure credits, USD2,500 in OpenAI credits, and self-paced training for technical and entrepreneurial skills. Remember, the earlier you submit to qualifying, the earlier you get access to the credits to start building your Minimum Viable Product (MVP).


 


2. Minimum Viable Product (MVP) Submissions (Jan 26 – Feb 9)


This stage is dedicated to submitting your Minimum Viable Product (MVP), a mandatory step to advance in the competition. Learn more about what is required for your MVP by downloading the Imagine Cup Rules and Regulations.


 


3. Semifinals: Accelerate Your Growth (Feb 23 – April 12)


Congratulations to the teams who make it to the semifinals! In this stage, you will collaborate with seasoned mentors to refine your business plan and prototype, harnessing AI for your startup’s success. Semifinalists will enjoy networking opportunities, technical guidance, guidance for AI acceleration, personalized one-to-one technical and business mentorship, and access to Founders Hub Level 2, along with an additional USD4,000 in Azure credits.


 


4. World Championship: Showcase Your Business Globally (April 26 – Build)


Welcome to the pinnacle of Imagine Cup—the Imagine Cup World Championships! The top three teams compete on a global stage at Microsoft Build, showcasing the depth and promise of their startups. Step into the spotlight and demonstrate how technology can drive positive change.


This stage offers global stage visibility, the chance to win an exclusive mentorship session with Microsoft Chairman and CEO, Satya Nadella, and cash prizes of USD100,000 for the winning team and USD50,000 for the two runner-up teams.


How to get started


But that’s not all! As you embark on your Imagine Cup journey, consider starting with the Imagine Cup Cloud Skills Challenge. It’s your chance to learn essential AI, tech, and entrepreneurial skills for the competition.


Imagine Cup 2024 is your pathway to innovation, mentorship, and global recognition. Join us today and unleash your startup’s potential. Submit the proof of concept for your business by January 24; the sooner you apply, the sooner you gain access to your benefits.


 


Together, we’ll create a brighter future through AI-driven entrepreneurship. Along the way, connect with a vibrant community of changemakers and drive meaningful positive change through your journey as a founder.



Register for the 2024 Imagine Cup


 

Lesson Learned #444: Handling the “Row Value Expressions Exceeds Maximum Allowed” Error in SQL Server

This article is contributed. See the original author and article here.

A few days ago, we faced the following error message: “Msg 10738, Level 15, State 1, Line 2 The number of row value expressions in the INSERT statement exceeds the maximum allowed number of 1000 row values.” Our customer is using ODBC Driver 18 for SQL Server and received this message, with the complete error: Error: ('42000', '[42000] [Microsoft][ODBC Driver 18 for SQL Server][SQL Server]The number of row value expressions in the INSERT statement exceeds the maximum allowed number of 1000 row values. (10738) (SQLExecDirectW)').


 


Understanding the Error:


 


The error message you’re encountering, often referred to as “Row Value Expressions Exceeds Maximum Allowed,” occurs when you attempt to insert more than 1000 rows using a single INSERT statement, that is, when the VALUES clause (table value constructor) contains more than 1000 row value expressions. SQL Server has a built-in limitation that restricts the number of rows you can insert in one statement to prevent performance and stability issues on the server.


 


Why Does It Happen?


This limitation exists to safeguard the server from processing excessively large insertions that could impact its performance negatively. By restricting the number of rows per INSERT statement, SQL Server can maintain a balance between data consistency and system resources.


 


Example of the script:


 

INSERT INTO MiTabla (ID, Edad) values
(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50)
,(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50),(1,50)

 


Solutions:


Now that we’ve established why this error occurs, let’s explore some potential solutions to address it:


 


Batched Insertions:


One straightforward approach to overcome this limitation is to divide your data into smaller batches and perform multiple INSERT statements. For example, if you need to insert 2000 rows, split them into two batches of 1000 rows each and execute two separate INSERT statements.

-- First batch
INSERT INTO MiTabla (ID, Age)
VALUES
    -- 1000 rows here

-- Second batch
INSERT INTO MiTabla (ID, Age)
VALUES
    -- 1000 rows here
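
Since the original error surfaced through ODBC Driver 18 (the tuple-style message suggests a Python client such as pyodbc), the batching can also be done on the client side. The following is a minimal sketch rather than the customer's actual code; the table and column names (MiTabla, ID, Edad) come from the example script above, and the connection string is a placeholder. With fast_executemany enabled, pyodbc sends the rows as parameter arrays instead of one multi-row VALUES list, so no single statement exceeds the 1000-row limit.

import pyodbc

# Placeholder connection string - replace server, database, and credentials with your own.
CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=yourserver.database.windows.net;"
    "DATABASE=YourDatabase;"
    "UID=youruser;PWD=yourpassword;Encrypt=yes;"
)

def insert_in_batches(rows, batch_size=1000):
    # Insert rows into MiTabla in chunks so no statement carries more than 1000 row values.
    conn = pyodbc.connect(CONN_STR)
    try:
        cursor = conn.cursor()
        cursor.fast_executemany = True  # bind parameter arrays instead of row-by-row round trips
        for start in range(0, len(rows), batch_size):
            chunk = rows[start:start + batch_size]
            cursor.executemany("INSERT INTO MiTabla (ID, Edad) VALUES (?, ?)", chunk)
        conn.commit()
    finally:
        conn.close()

# 2000 sample rows, mirroring the repeated (1, 50) values from the failing script.
insert_in_batches([(1, 50)] * 2000)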

 


Use Temporary Tables or Staging Tables:


Another approach is to use temporary tables or staging tables to hold your data temporarily. You can insert your data into these tables in smaller chunks and then transfer the data to the target table using a series of INSERT INTO … SELECT statements.


 

-- Create a staging table
CREATE TABLE StagingTable (
    ID INT,
    Age INT
)

-- Insert data into the staging table in batches

-- Then transfer the data to the target table
INSERT INTO MiTabla (ID, Age)
SELECT ID, Age FROM StagingTable

 


Conclusion:


The “Row Value Expressions Exceeds Maximum Allowed” error in SQL Server serves as a safeguard to prevent excessive insertions in a single INSERT statement. By understanding why this error occurs and employing batched insertions or other strategies, you can efficiently manage and insert large datasets into your SQL Server database without encountering this limitation. You can find more information here: Table Value Constructor (Transact-SQL) – SQL Server | Microsoft Learn


 


 


 


 

Kusto NLog connector now supports Azure Data Explorer Free Clusters


This article is contributed. See the original author and article here.



Introduction


NLog is a popular logging framework used by developers to log messages from .NET applications. It captures and stores critical information, providing essential insights for debugging and optimization.


 


Earlier in the summer we released the open-source NLog connector for Azure Data Explorer. Azure Data Explorer (ADX) is a fast and highly scalable data exploration service that can be used to store and analyze large volumes of data. 


 


In this blog post, we walk you through the latest enhancements in the NLog connector for Azure Data Explorer, which mark a significant leap forward in offering seamless integration with Azure Data Explorer for unparalleled application behavior insights and monitoring.


 


In the latest connector update, a host of powerful features has been introduced. This includes seamless support for Free Azure Data Explorer Clusters and integration with KQL Database in Microsoft Fabric Real-Time Analytics. Now, developers can dive into the world of ADX without the need for an Azure account subscription or credit card. Simply create a free ADX account, and you’re all set to harness the full potential of your logs using an ADX cluster.


 


Leveraging the expressive power of Kusto Query Language (KQL), developers can uncover invaluable insights into their application’s behavior, performance metrics, error occurrences, and even detect anomalies. This newfound capability opens a world of possibilities for fine-tuning and optimizing applications based on data-driven decisions.


 


Moreover, bid farewell to the IngestionEndpoint property and embrace the power of Kusto Connection Strings. These strings serve as your all-access pass to a range of authentication modes, including the convenient User Prompt Authentication and the secure User Token Authentication.


 


Diving into the ADX Target Sample Application with the latest updates


You can find the detailed initial setup steps in the previous blog post here: Getting started with NLog and Azure Data Explorer – Microsoft Community Hub


 


This walkthrough is similar to the previous blog post, but in the context of Free Cluster and connection string support.


 


Pre-requisites:



Steps


Create a table in Azure Data Explorer to store logs. The following command can be used to create a table with the name “ADXNLogSample”.


 

.create table ADXNLogSample (Timestamp:datetime, Level:string, Message:string, FormattedMessage:dynamic, Exception:string, Properties:dynamic)

 


Clone the NLog-ADX target git repo.


 

git clone https://github.com/Azure/azure-kusto-nlog-sink.git

 


Set the following environment variables in the sample application:



  • CONNECTION_STRING: Kusto connection string of the ADX cluster you created.

  • Eg: Data Source=https://ingest-..kusto.windows.net;Fed=True

  • DATABASE: The name of the database into which data should be ingested.


Simply build and run the application:



  • Install the ADX Target for NLog


The ADX Target for NLog is available as a NuGet package. To install it, run the following .NET CLI command from your project folder:


 

dotnet add package NLog.Azure.Kusto --version 2.0.1

 


Build the application and run it



  • Open a PowerShell window, navigate to the NLog ADX Target base folder, and run the following command:

    dotnet build


  • Once the build completes, navigate to src/Nlog.Azure.Kusto.Samples/ and run the following command to run the sample application:

    dotnet run


  • The sample application will open a login prompt in your default browser, where you need to enter your username and password.




 



  • The ingested log data can be verified by querying the created log table (ADXNLogSample in our case) with the following KQL command; a sketch for running richer queries from Python follows below.

    ADXNLogSample | take 10
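
Beyond the quick take 10 check, you can run richer KQL over the ingested logs from code. The snippet below is a minimal sketch using the azure-kusto-data Python package (it is not part of the NLog connector itself); the cluster URI and database names are placeholders, and the column names come from the .create table command earlier in this post.

from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

cluster_uri = "https://<your-cluster>.kusto.windows.net"  # placeholder - your ADX or free cluster query URI
database = "<your-database>"                              # placeholder - the database you ingest into

# Interactive (user prompt) authentication, similar to the login prompt shown by the sample application.
kcsb = KustoConnectionStringBuilder.with_interactive_login(cluster_uri)
client = KustoClient(kcsb)

# Count log events per level over the last hour to spot error spikes.
query = """
ADXNLogSample
| where Timestamp > ago(1h)
| summarize Events = count() by Level
| order by Events desc
"""
for row in client.execute(database, query).primary_results[0]:
    print(row["Level"], row["Events"])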





Conclusion


The NLog Azure Data Explorer Target connector is a great tool for log management that allows developers to send their log messages to ADX for analysis and visualization. With the new features of the connector such as support for ADX free cluster and Microsoft Fabric cluster, developers can now use ADX without an Azure account or credit card. By using KQL queries, developers can gain insights into their application behavior, performance, errors, and anomalies. We hope this article has helped you understand how to use the new features of the connector and how to make sense of your logs using an ADX cluster.


 


Documentation: Ingest data with the NLog sink into Azure Data Explorer – Azure Data Explorer | Microsoft Learn


Open-Source Repository: Azure/azure-kusto-nlog-sink: Nlog custom target for storing logs to ADX (github.com)


NLog Kusto Connector Nuget: NuGet Gallery | NLog.Azure.Kusto 2.0.1


Cumulative Update #23 for SQL Server 2019 RTM

This article is contributed. See the original author and article here.

The 23rd cumulative update release for SQL Server 2019 RTM is now available for download at the Microsoft Downloads site. Please note that registration is no longer required to download Cumulative updates.
To learn more about the release or servicing model, please visit:



Starting with SQL Server 2017, we adopted a new modern servicing model. Please refer to our blog for more details on the Modern Servicing Model for SQL Server.


How Azure is keeping customers secure against the Rapid Reset DDoS Vulnerability

This article is contributed. See the original author and article here.

Bad actors can exploit a new security vulnerability to initiate a DDoS attack on a customer’s infrastructure. This attack is leveraged against servers implementing the HTTP/2 protocol, and Windows, .NET Kestrel, and HTTP.sys (IIS) web servers are impacted. Azure Guest Patching Service keeps customers secure by ensuring the latest security and critical updates are applied using Safe Deployment Practices on their VMs and VM Scale Sets.


 


As the latest security fixes are released for Windows and for Linux distributions, Azure will apply them for customers opted into either Auto OS Image Upgrades or Auto Guest Patching. By opting into the auto update mechanisms through Azure, customers can remain proactive against security issues rather than reacting to attackers. Customers not leveraging the auto update capabilities through Azure Guest Patching Service are recommended to update their fleet with the latest security updates (KB5031364 for Windows and the fix for CVE-2023-44487 for open-source software distributions).


 


Without the latest security updates, organizations risk exposing their systems and data to potential security threats and web attacks. It is important for organizations to plan for this update to avoid any disruption to their business operations.  


 


Microsoft recommends enabling Azure Web Application Firewall (WAF) on Azure Front Door or Azure Application Gateway to further improve security posture. WAF rate limiting rules are effective in providing additional protection against these attacks. See additional recommendations from Microsoft Security Response Center for this vulnerability.  


 
Enabling Auto Update Features: Azure recommends the following features to ensure VM and VM Scale Sets are secured with the latest security and critical updates in a safe manner: 


 


Auto OS Image Upgrades: Azure replaces the OS disk with the latest OS Image. Supports rollback and rolls the upgrade across scale sets throughout all the regions. 


Auto Guest Patching: Azure applies the latest security and critical updates to an asset and rolls the update across the fleet throughout all the regions.  


 


The recent announcement of a new security issue is an important reminder for organizations to stay current with their software solutions to avoid any security or performance issues. Azure continues to keep customers secure by rolling out the latest security updates through multiple mechanisms for VM and VM Scale Sets in a safe manner. Customers are recommended to leverage the auto update capabilities in Azure to ensure they remain proactive against bad actors.  

Introducing multiple recurrence support for the work hour calendar in Universal Resource Scheduling (URS)


This article is contributed. See the original author and article here.

The work hour calendar multiple recurrence feature is a new URS functionality that allows you to create and manage work hour calendars with more flexibility and efficiency. You can now define multiple recurrence patterns for your work hour calendar events, such as daily, weekly or monthly, and specify different start and end dates for each pattern. This way, you can easily accommodate different work schedules, holidays, and special events in your organization.

The new multiple recurrence feature in the upcoming V2 work hour calendar can help you to:

  • Add multiple recurrences within a single day to represent different instances of recurring shift work, e.g. morning, afternoon, and evening shifts in a single day with different recurrences
  • Have overlapping recurrences within a week, e.g. a recurrence for Mon and Wed, and a recurrence for Tues. Previously the Tuesday recurrence would have deleted the Mon and Wed entries; now they can coexist alongside each other.
  • Input work hour events in different timezones, which is helpful for workers who travel. Previously, the calendar supported only one timezone across all work hour calendar events.

What are work hour calendar events and why are they needed?

Work hour events define when a resource is available to perform work, and they come in two types:

  • Occurrences (one-time events) are work hour events that happen only once on a specific date and time. Occurrences always take priority over Recurrences. E.g. team cohesion days, seminars or emergencies.
  • Recurrences (repeating events) are work hour events that repeat on a regular basis according to a pattern and frequency. E.g. rotational shift work, weekly cadences, monthly client visits

Occurrences and recurrences can be used today in URS to define different types of work hours, such as working hours, non-working hours, breaks and time off.

How did URS handle work hour events before (V1 work hour calendar)?

Before this update, only one recurrence event was supported per calendar day for a given date span.

Scenario 1, Jane is a doctor who does shift work at various clinics:

  • Recurrence 1 (morning shifts): 7am-12pm UTC, repeats Mon, Tues, Wed
  • Recurrence 2 (afternoon shifts): 1pm-5pm UTC, repeats Tues, Wed, Thu
  • Recurrence 3 (night shifts): 7pm-11pm UTC, repeats Wed, Thu and Fri

The old work hour calendar did not support more than one work hour event per calendar day, so this scenario was not supported.

Scenario 2, John is a utilities engineer with different work hours on alternating days:

  • Recurrence 1: 8am-5pm UTC, repeats Mon, Wed and Fri
  • Recurrence 2: 6am-8pm UTC, repeats Tues and Thu

Adding both recurrences was not supported in the old work hour calendar; Recurrence 2 would have deleted the Mon, Wed and Fri entries from Recurrence 1 for a given date span.

Scenario 3, Becca is a travelling salesperson who works in both Seattle and Singapore:

  • Recurrence 1 (work in Seattle): 8am-5pm PT, repeats all days of the week
  • Recurrence 2 (work in Singapore): 8am-5pm SGT, repeats all days of the week

Adding both recurrences of different timezones was not supported in the old work hour calendar.

How does URS handle work hour events now (V2 work hour calendar)?

The new V2 work hour calendar now follows this logic:

  • Occurrences have a higher priority than Recurrence rules for a given calendar day. So if there were two rules (one occurrence and one recurrence) on the same day, the daily occurrence or time-off occurrence will take priority over the weekly recurrence for the entire calendar day. (Unchanged from the previous behavior.)
  • When there are multiple recurrences within the same date span:
    • If the times do not intersect, they will both remain on the calendar
    • If the times conflict, the rule that was most recently created or modified is the one considered for the resource’s calendar, and all other conflicting rules in the date span are removed. If some recurrences conflict on some dates but not on others, the rule is spliced to retain the non-conflicting events while removing the events on dates that do have conflicts (see the sketch after this list).
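
To make these priority rules concrete, here is a small illustrative sketch of the behavior described above for a single calendar day. This is not the actual URS implementation, just a model of the logic; the class and function names are invented for the example.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class WorkHourEvent:
    start: datetime        # event start time
    end: datetime          # event end time
    is_occurrence: bool    # occurrences override all recurrences for the day
    modified_on: datetime  # most recently created/modified rule wins a conflict

def resolve_day(events):
    # Return the work hour events that survive for one calendar day under the V2 rules.
    occurrences = [e for e in events if e.is_occurrence]
    if occurrences:
        # An occurrence takes priority over recurrences for the entire day.
        return occurrences
    # Keep non-intersecting recurrences; when times conflict, the newest rule wins.
    survivors = []
    for event in sorted(events, key=lambda e: e.modified_on, reverse=True):
        conflicts = any(event.start < kept.end and kept.start < event.end for kept in survivors)
        if not conflicts:
            survivors.append(event)
    return survivors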

Scenario 1, Jane is a doctor who does shift work at various clinics:

  • Recurrence 1 (morning shifts): 7am-12pm UTC, repeats Mon, Tues, Wed
  • Recurrence 2 (afternoon shifts): 1pm-5pm UTC, repeats Tues, Wed, Thu
  • Recurrence 3 (night shifts): 7pm-11pm UTC, repeats Wed, Thu and Fri

Create Recurrence 1, Recurrence 2, then Recurrence 3 in succession. All will now show up on the calendar as seen below

Scenario 2, John is a utilities engineer with different work hours on alternating days:

  • Recurrence 1: 8am-5pm UTC, repeats Mon, Wed and Fri
  • Recurrence 2: 6am-8pm UTC, repeats Tues and Thu

Create Recurrence 1, then create Recurrence 2 in succession. Both will now show up on the calendar as seen below:

Scenario 3, Becca is a travelling salesperson who works in both Seattle and Singapore:

  • Recurrence 1 (work in Seattle): 8am-5pm PT, repeats all days of the week
  • Recurrence 2 (work in Singapore): 8am-5pm SGT, repeats all days of the week

As seen above, both Seattle and Singapore work hours are easily expressed on the V2 work hours calendar. Note that the Singapore work hours are shifted to match the dispatcher’s timezone, i.e. Pacific Time; the timezone of the calendar itself is visible at the bottom left of the calendar, and the dispatcher can alter it in Personalization Settings.

What else has changed/remains unchanged?

Previously, in the V1 Work Hour Calendar, only one recurrence was allowed per calendar day, so adding any new recurrence would completely override the work hour events for that calendar day.

With the V2 Work Hour Calendar, the previous work hour events will only be overridden if there is a direct conflict in time between the 2 recurrences.

For instance, Joel is an equipment technician with the following work hours:

  • Recurrence 1 (regular work hours): 9am-5pm PT, repeats all days of the week
  • Recurrence 2 (temporary work hours): 1-9pm PT, only from Jul 10-14

Create Recurrence 1, then create Recurrence 2 in succession. As seen below, Recurrence 2 overrides Recurrence 1 for the Jul 10-14 period because there is a direct conflict between the recurrences. All other work hour events remain.

The following dialog will now appear whenever a new work hour event is added, to remind users of this behavior:

Occurrences remain unchanged from the previous V1 calendar i.e. Occurrences always take priority over Recurrences and will override recurrences for the entire day.

For instance, Duke is an equipment technician with the following work hours:

  • Recurrence 1 (regular work hours): 9am-5pm PT, repeats all days of the week
  • Occurrence 1 (team cohesion): 6-9pm PT, only on Aug 1

Create Recurrence 1, then create Occurrence 1 in succession. As seen below, Occurrence 1 completely overrides all other work hours events for the Aug 1 calendar day even if there is no direct collision between the Recurrence and the Occurrence.

When will the V2 Work Hour Calendar be available, and how can I get my hands on it?

The V2 Work Hour Calendar will be available in early September 2023 in our Early Adoption Wave 2 update. You can opt in through the Power Platform Admin Center, as seen below:

How can I find out more?

If you want to learn more about the new work hour calendar multiple recurrence feature, you can:

Read the documentation here: Edit work hour calendars by using APIs in Dynamics 365 Field Service – Dynamics 365 Field Service | Microsoft Learn

Join the community forum here: https://community.dynamics.com/

Contact the support team here: https://support.microsoft.com/en-us/contactus/

We hope you enjoy the new work hour calendar multiple recurrence feature and find it useful for your business needs. We appreciate your feedback and suggestions on how to improve our products and services. Thank you for choosing Dynamics 365!

The post Introducing multiple recurrence support for the work hour calendar in Universal Resource Scheduling (URS) appeared first on Microsoft Dynamics 365 Blog.

Brought to you by Dr. Ware, Microsoft Office 365 Silver Partner, Charleston SC.

Microsoft Syntex adds taxonomy and image tagging, OCR, content query, annotations, and more


This article is contributed. See the original author and article here.

Welcome to the fall! This month’s Microsoft Syntex update is gearing up to be a great one in the world of content processing. We have updates on Syntex taxonomy tagging and image tagging; a set of Syntex capabilities coming to preview for pay-as-you-go users; the general availability of Syntex optical character recognition (OCR) expanding to include PDF and TIFF support; and lastly, both Syntex OCR and Syntex structured document processing moving to general availability.


 


Syntex taxonomy tagging and Syntex image tagging in general availability



In our previous blog post, we shared that Syntex taxonomy tagging and image tagging were rolling into general availability. We’re happy to share that both services have now completed rollout and are generally available to all Syntex pay-as-you-go users.


 


As a refresher, Syntex Taxonomy Tagging uses AI to help you label and organize documents by automatically tagging them with descriptive keywords, based on your taxonomy defined in SharePoint. When you apply a taxonomy column and enable taxonomy tagging, the document is automatically tagged with keywords from your term store to help with searching, sorting, filtering, and more. This reduces manual work and makes it faster and more efficient to categorize, find, and manage files in your document libraries. Overview of taxonomy tagging in Microsoft Syntex – Microsoft Syntex | Microsoft Learn


 




Taxonomy Tagging – the location column auto-populates based on your term store in this example


 


Syntex Image Tagging is now also generally available. Image Tagging is an AI-powered service that helps you label and organize images by automatically tagging them with descriptive keywords. These tags are stored as metadata to optimize searching, sorting, filtering, and managing your images. With this Syntex service, it’s much faster to categorize and search for specific images that you need. Overview of enhanced image tagging in Microsoft Syntex – Microsoft Syntex | Microsoft Learn


 


Image Tagging – images are auto-tagged with descriptive keywords



New features in preview for Syntex pay-as-you-go users



We’re excited to share that, for a limited time, Syntex pay-as-you-go users now get to use all the Syntex features previously only available to customers with the SharePoint Syntex seat license. If you’re not yet a Syntex customer, now is a particularly great time to give it a go. These services will be available as a preview through June 30, 2024.



1. Content query – an advanced, powerful search with custom metadata in a form-based interface
2. Universal annotation – add ink and highlights to additional file types like PDF & TIFF supported by our file viewer
3. Accelerators – preconfigured templates that leverage Syntex capabilities in an end-to-end solution for common scenarios like contract management and accounts payable
4. Taxonomy services – admin reporting on term set usage, easy import from SKOS-formatted taxonomies, and the ability to push a content type to a hub
5. Content processing rules – lightweight automation for common operations such as moving or copying a file, and setting a content type from the file name or path in SharePoint
6. PDF merge/extract – combine two or more PDF files into a new PDF file, or extract pages from one PDF into a new one


 




Site accelerator – preconfigured site templates for Accounts Payable


 


For Syntex pay-as-you-go customers, these capabilities will be readily available to you without taking any action or set up and without any additional charge. Microsoft Syntex features limited time license – Microsoft Syntex | Microsoft Learn



Syntex OCR and structured document processing will be generally available this month



Lastly, Syntex optical character recognition (OCR), which was previously in public preview, will be generally available this month! In images containing text – such as screenshots, scanned documents, or photographs – Syntex OCR automatically extracts the printed or handwritten text and makes it discoverable, searchable, and indexable.


 


It can be used for image-only files, now including PDF and TIFF as mentioned in the introduction, in OneDrive, SharePoint, Exchange, Windows devices and Teams messages. Searching for images is improved thanks to OCR, and IT admins can better secure images across OneDrive, SharePoint, Exchange, Teams and Windows devices with data loss prevention (DLP) policies.


 




Optical Character Recognition (OCR) auto-extracts text from images


 


You will also be able to use both the Syntex structured and freeform document processing features later this month when they become generally available as a new pay-as-you-go meter called “Structured document processing”. Unlike in the past, you will be able to use these services with your Azure subscription – no per-user license required, no AI Builder credits needed (but if you want to use AI Builder credits, we will still support that as well). Microsoft 365 roadmap ID 167309. Get started here.



Stay connected



And there you have it, lots of updates on Syntex services that will help your organization manage your content and improve content discovery, with less redundancy and greater efficiency. To get the latest on Syntex, join our mailing list for updates, and register for the upcoming October 18th Syntex Community Call.


 


Be sure to also connect with us at Microsoft Ignite, November 14-17, 2023, in Seattle or virtually!

Realize a Lakehouse using best-of-breed open source with HDInsight


This article is contributed. See the original author and article here.

Author: Reems Thomas Kottackal, Product Manager


 


HDInsight on AKS is a modern, reliable, secure, and fully managed Platform as a Service (PaaS) that runs on Azure Kubernetes Service (AKS). HDInsight on AKS allows an enterprise to deploy popular open-source analytics workloads like Apache Spark, Apache Flink, and Trino without the overhead of managing and monitoring containers.


 


You can build end-to-end, petabyte-scale Big Data applications spanning event storage using HDInsight Kafka, streaming through Apache Flink, data engineering and machine learning using Apache Spark, and Trino’s powerful query engine, in combination with Azure analytics services like Azure Data Factory, Azure Event Hubs, Power BI, and Azure Data Lake Storage.
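
As a flavor of what such a pipeline can look like, here is a minimal PySpark sketch that streams events from an HDInsight Kafka topic into ADLS Gen2, where they can later be queried with Trino or Spark. The broker addresses, topic name, storage account, and paths are placeholders, and this is an illustration under those assumptions rather than a reference implementation.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lakehouse-ingest").getOrCreate()

# Read a stream of events from a Kafka topic hosted on an HDInsight Kafka cluster (placeholders).
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "<broker1>:9092,<broker2>:9092")
          .option("subscribe", "sales-events")
          .load())

# Land the raw payloads in ADLS Gen2 as Parquet so they can be queried downstream.
(events.selectExpr("CAST(value AS STRING) AS payload", "timestamp")
 .writeStream
 .format("parquet")
 .option("path", "abfss://lake@<storageaccount>.dfs.core.windows.net/raw/sales")
 .option("checkpointLocation", "abfss://lake@<storageaccount>.dfs.core.windows.net/_checkpoints/sales")
 .start()
 .awaitTermination())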



HDInsight on AKS can connect seamlessly with HDInsight, so you can reap the benefits of using the cluster types you need in a hybrid model and interoperate with HDInsight cluster types using the same storage and metastore across both offerings.

The following diagram depicts an example of an end-to-end analytics landscape realized through HDInsight workloads.


 




 


 


We are super excited to get you started, so let's get to the how!